One of the many challenges scholars face is identifying the impact their scholarship is having in their respective disciplines. This need is frequently driven by promotion and tenure requirements. Alternatively, scholars may want to locate the most important research being done in their field: what ideas are now shaping opinions, methods, and research? The impact of published scholarship also matters for institution-level evaluation conducted by university and college administrations and for collection building and maintenance projects conducted by the library.
Traditionally, journal impact factors have been used to gauge the impact of scholarship. The impact factor came into use during the 1970s through the work of Eugene Garfield. Calculated by dividing the number of citations a journal receives in a given year to items it published during the previous two years by the number of citable items it published in those two years, the impact factor illuminates which journals are most influential in a scholar's given field. In other words, it is a measure of how often a journal is cited by other journals in a field.
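The two-year impact factor described above amounts to a simple ratio, which a small worked example makes concrete (the citation and item counts below are invented purely for illustration):

```python
def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Two-year journal impact factor: citations received this year to
    items the journal published in the previous two years, divided by
    the number of citable items published in those same two years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: in 2023, its 2021-2022 articles received 450
# citations, and it published 150 citable items across 2021 and 2022.
print(impact_factor(450, 150))  # 3.0
```

Note that the denominator counts only "citable" items (typically research articles and reviews), which is one of the definitional choices critics of the metric point to.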
The journal impact factor is still used today as a measure of the relative importance of a journal within its field. However, there has been growing dissatisfaction with reliance upon traditional journal impact factors: they impose a time lag in assessment, since journals take time to accrue citations; the system can be gamed; and an article cited as a cautionary or corrective tale to the scholarly community may have its rank elevated. As a result, different metrics have emerged, including author- and article-level metrics and alternative metrics ("altmetrics"), which rely upon social media, reference-management software, and other non-traditional citations. Additionally, tools are available that let researchers create profiles that track their individual impact and correctly and consistently identify which research outputs are the product of their efforts.
This section of the Toolkit does not provide guides to finding and using the variety of traditional journal-, article-, and author-level metrics. Rather, it provides librarians with resources to help them think critically about traditional metrics and to facilitate conversations with the scholars they serve about the role of impact factors in evaluating scholarship, whether for tenure and promotion or for finding and evaluating research in one's field. This section also provides tools and resources on the growing field of altmetrics, as well as tips and resources for helping scholars maximize the identification and impact of their research.
The following resources offer a critical look at traditional measurements of impact:
In the past, peer-reviewed journals have been the standard for measuring the quality and importance of scholarly literature. However, the growing number of published journals and articles has placed a significant strain on this process. Part of the problem is attributable to a publication rate far outpacing the pool of available referees. With scant incentives available for referees in the form of credit in the tenure and promotion process, this shortage is expected to continue.
Despite its strong track record in filtering research, the peer review system has shown signs of weakness. The authors of the altmetrics manifesto point out that “peer review…is slow, encourages conventionality, and fails to hold reviewers accountable…[and] given that most papers are eventually published somewhere, peer-review fails to limit the volume of research.” By this diagnosis, the traditional model is unsustainable over the long term. In response, new alternative forms of peer review have begun to develop that take a more open and transparent approach than the traditional peer review model:
Open Peer Review: Open peer review begins with an author or editor posting an unpublished manuscript online in a comment-enabled web environment, inviting peers to comment and criticize. The argument is that open peer review allows for transparency in the peer review process and enables a wider variety of input, including cross-discipline critique and more technical, "non-scholarly" input.
Post-Publication Peer Review: In this model, evaluation continues after publication: the published work is openly reviewed, commented on, and rated by readers, so assessment accumulates as an ongoing public conversation rather than ending at acceptance.
As more information is disseminated electronically, researchers will come to interact with that information on the open web in a variety of ways and through many different platforms and media types. Altmetrics measure how many times a journal article is downloaded, shared, commented on, and cited in social media outlets, and can provide a meaningful indicator of the impact an article has among different user populations. Jason Priem, one of the founders of the study of altmetrics, and Bradley Hemminger have compiled a list of sources from which data can be collected and relayed back to scholars as meaningful impact data. These sources fall into seven categories:
Culling usage data from various social sites has several advantages. First, sites that have open APIs (application programming interfaces) can be accessed immediately for up-to-date usage statistics – an advantage over traditional citations, which accrue slowly. Second, a growing number of commercial platforms such as ImpactStory, Altmetric.com, and Plum Analytics allow scholars to track usage of their works across several research blogs, journals, and user populations. This allows a granularity of data and presents a broader perspective of overall impact. It is important to note that altmetrics are not meant to replace traditional citation metrics, but are best used in conjunction with citations for an overall picture of scholarly impact.
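The kind of aggregation these platforms perform can be sketched in a few lines. The JSON payload and field names below are illustrative assumptions, not any real provider's schema; a live service would return a response like this from its API for a given DOI:

```python
import json

# Hypothetical altmetrics API response for one article. The "counts"
# field names here are invented for illustration only.
sample_response = json.loads("""
{
  "doi": "10.1234/example.5678",
  "counts": {"twitter": 42, "blogs": 3, "mendeley_readers": 120, "news": 1}
}
""")

def total_mentions(record):
    """Sum the per-source counts into one rough overall attention figure."""
    return sum(record["counts"].values())

def top_source(record):
    """Return the source contributing the most attention."""
    return max(record["counts"], key=record["counts"].get)

print(total_mentions(sample_response))  # 166
print(top_source(sample_response))      # mendeley_readers
```

Because each source measures a different audience and activity, platforms typically report the per-source counts alongside any combined score rather than relying on a single number.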
Additional Readings, Resources and Presentations on Altmetrics:
A growing number of tools are available to scholars for maximizing their impact, whether through proper identification of themselves as the creator of a research output or through social-media-style profile pages highlighting their expertise and output. Libraries can assist scholars in maximizing their impact by promoting author identification tools such as ORCID, sharing data on open access publishing and citation rates, and directing scholars to sites where they can develop their own researcher profile pages.
Tips and Tools for Maximizing Impact
Using Open Access to Increase Impact
Portions of this section of the Toolkit are adapted from an essay that was originally written by Bruce Runnels for a course in scholarly communications at the School of Library & Information Science at IUPUI with subsequent amendment by Jen Waller of Miami University (Ohio), Kevin Smith from University of Kansas, Steven R. Harris from the University of Nevada, Reno, and Christine Fruin from the University of Florida.