The 2013 edition of Journal Citation Reports® (JCR) recently became available. The annual compilation provides citation-based journal metrics as part of the Web of Science platform. This edition covers more than 10,800 of the world's most highly cited, peer-reviewed journals across 232 disciplines, from nearly 2,500 publishers in 83 countries; 379 journals received their first Journal Impact Factor.
In a 2006 JAMA commentary, Eugene Garfield, PhD, described his early work in 1955 on the idea of an impact factor for science. His ideas, and the company he founded to pursue them, evolved from print indexes into Web of Science and the other research productivity tools of its present owner, Thomson Reuters. The scholarly publishing world continues to seek ways to measure the publishing patterns and impact of individual authors and individual articles.
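Garfield's two-year impact factor reduces to a simple ratio: citations received this year to what a journal published in the previous two years, divided by the number of citable items it published in those years. A minimal sketch, with invented numbers for illustration:

```python
def impact_factor(citations, citable_items):
    """Two-year Journal Impact Factor: citations received this year
    to items a journal published in the previous two years, divided
    by the number of citable items it published in those years."""
    return citations / citable_items

# Hypothetical journal: 600 citations in 2013 to its 2011-2012 output,
# of which 200 items were citable (articles and reviews).
jif = impact_factor(600, 200)
print(jif)  # 3.0
```

Note that only "citable items" (typically articles and reviews) count in the denominator, while the numerator counts citations to anything the journal published, which is one of the long-running criticisms of the measure.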
In “Library Notes” (#62, March 2010), Biosciences and Bioinformatics Librarian Pamela Shaw wrote about the h-index as a means of measuring the impact of an author’s body of work. She identifies where to find the h-index, both in Web of Science and in other sources.
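The h-index has a simple operational definition: an author has index h if h of their papers have been cited at least h times each. A minimal sketch of the computation (the citation counts below are invented):

```python
def h_index(citation_counts):
    """Largest h such that the author has h papers
    cited at least h times each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Eight papers with these citation counts yield an h-index of 4:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([42, 18, 9, 6, 3, 2, 1, 0]))  # 4
```

One well-known property follows directly from the definition: a single blockbuster paper cannot raise the h-index by itself, since the index is capped by the total number of papers.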
Another measure that has been receiving interest is SNIP (Source Normalized Impact per Paper) and Leiden University’s modified SNIP, which “measures contextual citation impact by ‘normalizing’ citation values”. The modified SNIP is the measure used by Scopus.
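At its core, SNIP divides a journal's raw citations-per-paper by the citation potential of its subject field, so journals in sparsely citing fields are not penalized relative to heavily citing ones. A simplified sketch of that normalization idea, not CWTS Leiden's exact procedure (the numbers are invented):

```python
def snip(raw_impact_per_paper, field_citation_potential):
    """Source Normalized Impact per Paper, simplified:
    a journal's raw citations-per-paper divided by its field's
    citation potential (how heavily that field tends to cite)."""
    return raw_impact_per_paper / field_citation_potential

# Two hypothetical journals with identical raw impact land on
# different SNIP values because their fields cite differently:
print(snip(2.0, 4.0))  # 0.5 in a heavily citing field
print(snip(2.0, 1.0))  # 2.0 in a sparsely citing field
```

This is why SNIP is described as a "contextual" measure: the same raw citation rate means more in a field where citations are scarce.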
Contributors to the Society for Scholarly Publishing (SSP) blog, “The Scholarly Kitchen,” have shared insights about alternative metrics, the study and use of non-traditional scholarly impact measures. The past president of SSP, Judy Luther, aptly titled her article of a year ago “Altmetrics – Trying to Fill the Gap”. The term “altmetrics” has only been around since 2009/2010, when Jason Priem, a doctoral candidate at the School of Information and Library Science at the University of North Carolina at Chapel Hill, first used it in a tweet. View a podcast about altmetrics that features Priem, who co-founded Impact Story with Heather Piwowar.
Priem also published a 2012 PLoS ONE article describing the Altmetrics Collection, and Piwowar wrote “Altmetrics: Value all research products” (Nature, 2013 Jan 10). A response to the Piwowar article, “Altmetrics: Too soon for use in assessment” (Nature, 2013 Feb 14), offers a counterpoint, and another foray poses a question: “Do altmetrics work? Twitter and ten other social web services” (PLoS ONE, 2013 May). There is an online group site that pulls together discussions on approaches to assessing scholarly impact with new metrics, and Dario Taraborelli, the group’s site owner, offers this explanation:
“altmetrics go beyond traditional citation-based indicators as well as raw usage factors (such as downloads or click-through rates) in that they focus on readership, diffusion and reuse indicators that can be tracked via blogs, social media, peer production systems, collaborative annotation tools (including social bookmarking and reference management services).”
Will altmetrics be used by academically based and other authors to show their peers and employers the value, impact, and recognition of their work? An example of how one faculty member has done so appears in a June 3, 2013 Chronicle of Higher Education article by Jennifer Howard, “Rise of 'Altmetrics' Revives Questions About How to Measure Impact of Research.”
Article-Level Metrics (ALMs) are emerging as important tools to quantify how individual articles are being discussed, shared, and used. The Scholarly Publishing & Academic Resources Coalition (SPARC) Primer defines ALMs in the following way:
ALMs aggregate a variety of data points that collectively quantify not only the impact of an article, but also the extent to which it has been socialized and its immediacy. ALMs pull from two distinct data streams: scholarly visibility and social visibility. ALMs provide different markers of an article’s reach, beyond just citations. They can incorporate shorter term data points such as news coverage, blog posts, tweets, and Facebook likes. They can also include longer term markers such as download statistics and article comments. Taken collectively, these data points can present a much fuller perspective of an article’s impact.
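The SPARC primer's two data streams can be pictured as two sets of per-article counters reported side by side. A minimal sketch; the field names and counts below are hypothetical, not any particular ALM provider's schema:

```python
# Hypothetical per-article counts, split into the two streams
# the SPARC primer describes.
scholarly_visibility = {"citations": 12, "downloads": 1450}
social_visibility = {"tweets": 87, "blog_posts": 4,
                     "facebook_likes": 31, "news_stories": 2}

def alm_profile(scholarly, social):
    """Combine both streams into one article-level profile,
    keeping each marker separate rather than collapsing them
    into a single score, as ALM reports typically do."""
    return {"scholarly": dict(scholarly), "social": dict(social)}

profile = alm_profile(scholarly_visibility, social_visibility)
print(profile["social"]["tweets"])  # 87
```

Keeping the markers separate matters: a tweet and a citation signal very different kinds of attention, so ALM tools generally report the raw counts rather than a composite number.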
This is just a sampling of trends in the productivity measures arena. Some resources are freely available, while other products are fee-based. There are no crystal balls, and the jury is still out on the most effective productivity measures. The dialogue, and the debate, continue.
Related Blog Entries
Finding Cited References, an Author's h-index, and Journal Impact Factors (October 22, 2010)
Author Name Disambiguation Through the ORCID Initiative (January 27, 2010)
How do I Find a Journal's Impact Factor? (May 29, 2009)
Updated: September 25, 2023