The ASIS&T Bulletin special section on altmetrics presents alternative metrics as a new and critically needed approach to measuring the impact of scholarly research. With long-established citation-based metrics unable to capture the increasing variety of online references to a scholar’s work, alternative indicators offer a different view of the influence of that work. Contributed papers demonstrate how altmetrics can work on a personal level to enhance a scholar’s CV and on a broad, even global, level to transform scholarly communication through their interaction with open access, digital repositories and research in emerging countries. One article suggests altmetrics should soon be included among mainstream metrics, and other contributions describe specific indicators and altmetric software considerations. The need for innovative measurement and the advantages of altmetrics in particular bode well for their wide acceptance and continuing development.

altmetrics
measurement
indicators
impact of scholarly output
citation impact
scholarly publishing

Bulletin, April/May 2013


Introduction

Altmetrics: What, Why and Where?

by Heather Piwowar, Guest Editor

Altmetrics is a hot buzzword. What does it mean? What's behind the buzz? What are the risks and benefits of using alternative metrics of research impact – altmetrics – in our discovery and evaluation systems? How are altmetrics being used now, and where is the field going?

This special section of the Bulletin of the Association for Information Science and Technology focuses on these questions. Essays from seven perspectives highlight the role of altmetrics in a wide variety of settings. 

The collection begins with its most general article, one I authored with my ImpactStory co-founder Jason Priem, motivating the role of altmetrics for individual scholars through "The Power of Altmetrics on a CV." 

The next few papers highlight ways that altmetrics may transform scholarly communication itself. Ross Mounce, a doctoral student and Panton Fellow of the Open Knowledge Foundation, explores the relationship between open access and altmetrics in "OA and Altmetrics: Distinct but Complementary." Juan Pablo Alperin, doctoral student and developer with the Public Knowledge Project, encourages us to "Ask Not What Altmetrics Can Do for You, but What Altmetrics Can Do for Developing Countries." Stacy Konkiel and Dave Scherer, librarians at Indiana University and Purdue, respectively, discuss how altmetrics can empower institutional repositories in "New Opportunities for Repositories in the Age of Altmetrics."

Completing the collection are three more perspectives from the builders of hot altmetrics tools. Jennifer Lin and Martin Fenner, both of PLOS, explore patterns in altmetrics data in "The Many Faces of Article-level Metrics." Jean Liu, blogger, and Euan Adie, founder of Altmetric.com, consider "Five Challenges in Altmetrics: A Toolmaker's Perspective." Finally, Mike Buschman and Andrea Michalek, founders of Plum Analytics, wrap up the collection asking, "Are Alternative Metrics Still Alternative?"

Before you dive in, if you are new to altmetrics, let me give you a quick informal introduction. For decades, the most common metric for evaluating research impact has been the number of times a research article is cited by other articles. This metric is sometimes represented by the raw count of citations received by the specific article in question and sometimes through an impact-by-association proxy – the number of citations received by the journal that published the article, summarized using a formula called the journal impact factor.
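
For readers meeting it for the first time, the journal impact factor for year Y is computed, roughly, as follows (the notation here is mine, not an official one):

    \mathrm{JIF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}

where C_Y(y) is the number of citations received in year Y by items the journal published in year y, and N_y is the number of citable items the journal published in year y.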

Citations are not the only way to represent the impact of a research article. A few alternative indicators have been the subjects of webometrics and bibliometrics research for years, including download counts and mentions in patents. However, as scholarly communication moves increasingly online, more indicators have become available: how many times an article has been bookmarked, blogged about, cited in Wikipedia and so on. These metrics can be considered altmetrics – alternative metrics of impact. (Appropriately enough, the term altmetrics was first proposed in a tweet [https://twitter.com/jasonpriem/status/25844968813].)
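
To make that concrete, here is a minimal sketch of pulling such indicators for a single article through the public Altmetric.com API. The endpoint pattern is the one the service exposes for DOI lookups; the response fields printed below are assumptions for illustration, since the exact set returned varies by article.

    # Minimal sketch: look up indicators for one article by DOI via the
    # public Altmetric.com API. The field names printed below are
    # assumptions for illustration; responses vary by article.
    import json
    import urllib.request

    doi = "10.1371/journal.pbio.1000242"  # example DOI; substitute your own
    url = "https://api.altmetric.com/v1/doi/" + doi

    with urllib.request.urlopen(url) as response:
        data = json.load(response)

    for field in ("cited_by_posts_count",       # blog and news mentions
                  "cited_by_tweeters_count",    # distinct Twitter accounts
                  "cited_by_wikipedia_count"):  # Wikipedia citations
        print(field, "=", data.get(field, "not reported"))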

We might even consider nontraditional applications of citation metrics to be altmetrics – citations to datasets as first-class research objects, for example. Other examples include citation counts filtered by type of citation: citations from editorials, citations from review articles only or citations made only in the context of experimental replication. All of these are alternative indicators of impact.
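
No citation index offers such filtered counts as a standard product today, but the idea is simple to express. Here is a hypothetical sketch, with an invented record structure and invented type labels, of filtering a citation list by type:

    # Hypothetical sketch: filter citation counts by citation type.
    # The record structure and type labels are invented for illustration.
    citations = [
        {"citing_doi": "10.1000/a", "type": "review"},
        {"citing_doi": "10.1000/b", "type": "editorial"},
        {"citing_doi": "10.1000/c", "type": "replication"},
        {"citing_doi": "10.1000/d", "type": "review"},
    ]

    from_reviews = [c for c in citations if c["type"] == "review"]
    print(len(from_reviews), "citations from review articles")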

Altmetrics offer four potential advantages:

  • A more nuanced understanding of impact, showing us which scholarly products are read, discussed, saved and recommended as well as cited.
  • Often more timely data, showing evidence of impact in days instead of years.
  • A window on the impact of web-native scholarly products like datasets, software, blog posts, videos and more.
  • Indications of impacts on diverse audiences including scholars but also practitioners, clinicians, educators and the general public.

Of course, these indicators may not be “alternative” for long. At that point, hopefully we’ll all just call them metrics.

Dive in, read all about it and let us know what you think. Continued conversation, background information and crowdsourced lists of new research and resources can be found on Twitter using the hashtag #altmetrics (https://twitter.com/search/realtime?q=%23altmetrics), in the altmetrics Mendeley group (www.mendeley.com/groups/586171/altmetrics/papers/) and probably at a conference near you.

Thanks very much to all authors in this collection for voluntarily making their articles openly available for reuse under a Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/3.0/). 

Happy reading!
 


Heather Piwowar is a postdoc at Duke University, studying the adoption and use of open research data. She is also a co-founder of ImpactStory (http://impactstory.org/), an open-source web tool that helps scholars track and report the broader impacts of their research. @researchremix.