Altmetrics for Librarians: Pros and Cons

While digital libraries, institutional repositories, journals, and databases make scholarly research openly available for download, “alternative metrics,” or altmetrics, supply the usage statistics that help determine an article’s popularity and reading potential.[1] “We may be witnessing a tipping point in collaboration, faster access, and new opportunities.”[2] Altmetrics allow librarians to provide their users with statistics about academic articles far more quickly than traditional measures. Altmetrics can tell us how many times an article, website, software package, or blog has been viewed, downloaded, reused, shared, and cited.[3] It has been noted that there is a correlation between the number of online views and downloads of an article and the number of times that article will be cited in future research.[4]
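
The claimed link between early usage and later citations can be illustrated with a small, hypothetical calculation. The download and citation figures below are invented for illustration, and the Pearson coefficient is just one of several ways such a relationship might be measured:

```python
# Illustrative only: the download/citation counts below are invented,
# not taken from the study cited in the text.
from statistics import mean
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical figures: first-year downloads vs. citations three years later.
downloads = [120, 340, 560, 900, 1500]
citations = [2, 5, 9, 14, 22]

print(round(pearson_r(downloads, citations), 3))
```

A coefficient close to 1 would indicate that, in this fabricated sample, heavily downloaded articles also tend to be heavily cited later on.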

The pros of altmetrics

Altmetrics, which are generated far more quickly than traditional bibliometrics, give academic libraries a way to show stakeholders how successful and valuable their online repositories are, as well as how many faculty and students they serve. Altmetrics let librarians know how their collections are received in the “real world,” a helpful measure when it comes to collection management.[5]

Altmetrics further give librarians the ability to assess articles more effectively, and so not only to provide more tailored, specific searches, but also to demonstrate how popular the institution’s research is in a particular field.[3] “Altmetrics could also clearly be used in the context of the librarian being able to offer insights to their research community, and give guidance on how to maximise the success of their own research efforts.”[6] Indeed, even institutional research could be filtered and redirected according to readership preferences.

Altmetrics are also designed to be easy to use. Librarians, researchers, and other interested parties do not have to spend hours learning a cumbersome new proprietary system.

Altmetric tools, both open source and proprietary, provide economic incentives for use and greater data granularity. For example, Altmetric Explorer is free for librarians at academic institutions. While other measures, such as the impact factor (IF), gauge only a journal’s impact as a whole, Altmetric Explorer measures the impact of each individual article within a journal or repository.[7] The Explorer can browse data found in hundreds of thousands of blog posts, tweets, and other social media sources, then filter that information and measure levels of attention over time to give authors a unified picture of how popular their work is online.[7]
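
The kind of aggregation the Explorer performs can be sketched in miniature: gather per-source mention counts for each article, then roll them up into a single attention score. The DOIs, counts, and source weights below are all invented for illustration; Altmetric’s actual scoring algorithm is proprietary and considerably more sophisticated:

```python
# A minimal sketch of source-weighted attention scoring, loosely in the
# spirit of tools like Altmetric Explorer. Every value here is hypothetical.

# Invented weights per mention source: a news story counts for more
# than a single tweet.
WEIGHTS = {"news": 8, "blog": 5, "tweet": 1}

# Hypothetical mention counts for two articles, keyed by (made-up) DOI.
mentions = {
    "10.1000/example.001": {"news": 2, "blog": 3, "tweet": 40},
    "10.1000/example.002": {"news": 0, "blog": 1, "tweet": 120},
}

def attention_score(counts):
    """Weighted sum of mentions across sources (illustrative formula only)."""
    return sum(WEIGHTS.get(source, 1) * n for source, n in counts.items())

scores = {doi: attention_score(counts) for doi, counts in mentions.items()}

# Rank articles by attention, highest first.
for doi, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(doi, score)
```

Weighting a news story more heavily than a tweet is one plausible design choice; real altmetrics providers choose and document such weights themselves.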

Traditional measures emphasize not only how many times a particular journal is cited but also that journal’s prestige.[8] Such measures are now being called into question as the research community discovers that “Higher journal impact factors [are] correlated with a higher range of retractions due to fraud.”[9, p9] Moreover, as noted by the authors of the Altmetrics Manifesto, because the algorithms behind the journal impact factor are not public, they are vulnerable to manipulation.[1] Altmetrics can serve as a countermeasure to such problems.

Altmetrics can also encourage researchers to take a more active role in promoting collaborative problem solving and a broader understanding of their work. Leveraging social media and data concerning research consumption is becoming necessary for libraries, other academic institutions, and individual researchers as more people publish independently. “Combined, the exploitation of social media as a collaboration tool for research publication and dissemination is likely to change the current landscape in academic publishing with new ways of finding solutions, contacts, and content being created by the users themselves.” [2, p2]

The cons of altmetrics

Academics, including librarians, exist in a stressful climate of “publish or perish.” Altmetrics, by including social media data in measuring a document’s or scholar’s impact, have the potential to exacerbate that intense pressure. Lockley & Carrigan (2011) comment that “continual publishing” increases the size of the “academic footprint.”[10] Scholars will be in a perpetual state of publishing, which leads to other concerns: privileging quantity over quality of publication, letting the focus on publishing become detrimental to other academic roles such as teacher and mentor, and creating a massive virtual popularity contest. Altmetrics may also encourage use of a tool simply for the sake of using it, risking a glut of duplication that artificially inflates or deflates the data. There is a danger of information overload on a massive scale.

Continuous publication, especially the kind not mediated by a peer-review process like that of the traditional journal, will create a staggering amount of open-access information. While in many respects a good thing, this also makes it harder to assess document reliability. Quantifying impact by counting citations, tweets, and online mentions could be helpful, but it also privileges groupthink, which is not always the best approach, as discussed at length by Susan Cain.[11] Moreover, just as “gaming” is possible with traditional metrics,[1, p1] so “manipulation” is possible with altmetrics.[15, p2] For example, since article abstracts are not as carefully monitored as the articles themselves, they are more vulnerable to “spin” that leads to increased media attention.[9, p9] In fact, Yavchitz and colleagues (2012) have shown [cited in 9, p9] that “the abstract (40% with spin) was the strongest correlate of spin in the press release (47% with spin).” Web 2.0 is supposed to be about individual agency and interactivity, but altmetrics could create an environment with both too many options (the number of tools available) and too few (the privileging of a few types of interaction).

Another problem posed by altmetrics is that, because they are newcomers to the world of impact studies, much of the infrastructure these tools need to function optimally is still being built. Optimal tracking requires DOIs and PubMed IDs, but papers often lack them.[16] Researchers are now acquiring ORCID identifiers, unique IDs assigned to individual researchers; even with these, however, a DOI may still be required.[17] Altmetrics still lack tools for measuring media citations, private citations (such as those made through email), and web hits and downloads outside the Public Library of Science (PLoS).[17] Nor has a standardized method of recording qualitative data about research impact been developed.[17] Altmetric users would do well to remember that this is a young field of study that needs more solidification and examination before widespread implementation.


References

1. Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: a manifesto. 26 October 2010 [cited 1 Feb 2013]. Available from:

2. Yeong CH, et al. Altmetrics: the right step forward. Biomedical Imaging and Intervention Journal. 3 May 2012 [cited 2 Feb 2013]. Available from:

3. Konkiel S. Altmetrics: An app review [presentation]. Indiana University. 2012. [cited 1 Feb 2013]. Available from:

4. Priem J, Piwowar, H, Hemminger B. 2012. Altmetrics in the wild: using social media to explore scholarly impact. [cited 1 Feb 2013]. Available from:

5. Konkiel S, Noel R. Altmetrics and librarians: how changes in scholarly communication will affect our profession. May 7, 2012 [cited 1 Feb 2013]. Presented at Indiana University Libraries In-House Institute. Available from:

6. Galligan F. Altmetrics for librarians and institutions: part II. 31 August 2012 [cited 2 Feb 2013]. Available from:

7. Altmetric. 2012. The Altmetric Explorer (official website information). Available from:

8. Carpenter T. Altmetrics—Replacing the impact factor is not the only point. 14 Nov 2012 [cited 1 Feb 2013]. In: Anderson, K. and Davis, P. The Scholarly Kitchen: What’s Hot and Cooking in Scholarly Publishing [Blog]. Wheat Ridge, Colorado: Society for Scholarly Publishing. c2010- . [about 3 screens]. Available from:

9. Silver J. Concealment, lying, and exaggeration in research. Journal Watch: Medicine That Matters. 2013 Jan: 19 (1), 9.

10. Lockley P, Carrigan M. Impact of Social Sciences [Internet]. London (UK): London School of Economics and Political Science; 2011 - . Continual publishing across journals, blogs and social media maximizes impact by increasing the size of the ‘academic footprint.’ 2011 October 26 [cited 2013 February 2]. Available from:

11. Cain S. Quiet: the power of introverts in a world that won’t stop talking. New York, NY: Crown; 2012.

12. Galligan F. Swets blog [Internet]. [place unknown]: Fin Galligan; [date unknown] - . Altmetrics for librarians and institutions: part I; 2012 August 29 [cited 2013 February 2]. Available from:

13. Galligan F. Swets blog [Internet]. [place unknown]: Fin Galligan; [date unknown] - . Altmetrics for librarians and institutions: part III; 2012 September 4 [cited 2013 February 2]. Available from:

14. Priem J. Impact of Social Sciences [Internet]. London (UK): London School of Economics and Political Science; 2011 - . As scholars undertake a great migration to online publishing, altmetrics stands to provide an academic measurement of twitter and other online activity. 2011 November 21 [cited 2013 February 2]. Available from:

15. Anyangwe E. Twitter, peer review and altmetrics: the future of research impact assessment. 19 Sept 2012 [cited 1 Feb 2013]. In: Higher Education Network: Ideas, Insight, and Debate from the Global Higher Education Community [Blog]. London: Guardian News and Media. c2011- . [about 2 screens]. Available from:

16. Piwowar H. [Live chat panel member]. Response to nsnet. In: Twitter, peer review and altmetrics: the future of research impact assessment [Blog]. London: Guardian News and Media. 21 Sept 2012 [cited 1 Feb 2013]. Available from:

17. Scott N. Altmetrics are the central way of measuring communication in the digital age but what do they miss? 2012 Dec 17 [cited 1 Feb 2013]. In: LSE Impact of Social Sciences [Blog]. London: The London School of Economics and Political Science. c2013. [about 3 screens]. Available from: