NISO Alternative Assessment Metrics (Altmetrics) Initiative

In June 2013, the Alfred P. Sloan Foundation awarded NISO a grant to undertake a two-phase initiative to explore, identify, and advance standards and/or best practices for a new suite of potential assessment metrics emerging in the scholarly community. The initiative was a direct outgrowth of a breakout discussion group at the altmetrics 12 meeting in Chicago, IL, and is seen as an important step in the development and adoption of new assessment metrics, which include usage-based metrics, social media references, and network behavioral analysis. The NISO Altmetrics Initiative also explored potential assessment criteria for non-traditional research outputs, such as data sets, visualizations, software, and other applications. The first phase, which ran from 2013 to 2014, exposed areas for potential standardization, and the community collectively prioritized those potential projects. The second phase, which ran from 2014 to 2016, advanced several of those areas into recommended practices, prioritized by the community and approved by the NISO membership.

Phase 2 Projects

NISO Voting Members reviewed and approved a proposal to develop several standards or recommended practices during Phase 2 of the Altmetrics Initiative. The areas and topics to be addressed are:

  • Development of specific definitions for alternative assessment metrics – This working group will develop specific definitions for the terms commonly used in alternative assessment metrics, so that different stakeholders can be confident they are discussing the same concepts. This work will also lay the groundwork for the other working groups.

  • Definitions for appropriate metrics and calculation methodologies for specific output types – This working group will focus on research outputs that are currently underrepresented in research evaluation, including research data, software, performances, and outputs commonly found in the social sciences. The working group will recommend appropriate metrics for these research outputs and will develop guidelines for their use.

  • Development of strategies to improve data quality through source data providers – Reliable data quality is a prerequisite for using any alternative assessment metric in research evaluation. This working group will examine issues of data quality and will recommend strategies to overcome them, or to clarify the limitations of particular assessment metrics.

  • Promotion and facilitation of the use of persistent identifiers in scholarly communications – Persistent identifiers are needed not only to unambiguously identify the research outputs for which metrics are to be collected, but also to describe their relationships to other research outputs and to contributors, institutions, and funders (see the sketch after this list). This working group will work closely with other initiatives in the identifier space.

  • Descriptions of how the main use cases apply to and are valuable to the different stakeholder groups – Alternative assessment metrics can support a variety of use cases, from research evaluation to discovery. This working group will identify the main use cases and the stakeholder groups to which each is most relevant, and will develop a statement about the role of alternative assessment metrics in research evaluation.
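
For illustration only (this sketch is not part of any NISO deliverable), the short Python example below shows the kind of linking that persistent identifiers enable. It assumes the public Crossref REST API (https://api.crossref.org) and uses a placeholder DOI: given a DOI, the returned record connects a research output to its contributors (via ORCID iDs, where recorded) and to funder identifiers.

    import requests

    # Illustrative sketch only; not part of any NISO recommended practice.
    # Assumes the public Crossref REST API, which resolves a DOI (one common
    # persistent identifier) to structured metadata linking a research output
    # to its contributors and funders.

    DOI = "10.1371/journal.pone.0115253"  # placeholder; substitute any Crossref-registered DOI

    def describe_output(doi: str) -> None:
        """Fetch Crossref metadata for a DOI and list linked contributors and funders."""
        response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
        response.raise_for_status()
        work = response.json()["message"]

        print("Title:", (work.get("title") or ["(none)"])[0])

        # Contributors, with ORCID iDs when the record includes them
        for author in work.get("author", []):
            name = f"{author.get('given', '')} {author.get('family', '')}".strip()
            print("Contributor:", name, "| ORCID:", author.get("ORCID", "not recorded"))

        # Funders, with Open Funder Registry identifiers when available
        for funder in work.get("funder", []):
            print("Funder:", funder.get("name"), "| ID:", funder.get("DOI", "not recorded"))

    if __name__ == "__main__":
        describe_output(DOI)

ORCID iDs and Open Funder Registry identifiers are themselves persistent identifiers, which is what allows the relationships in such a record to be expressed unambiguously across systems.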


Committee Roster

Steering Group

Altmetrics WG A: Definitions & Use Cases, Co-chairs

Altmetrics WG A: Definitions & Use Cases, Members

  • Rodrigo Costas, Centre for Science and Technology Studies, University of Leiden
  • Beth Martin, University of North Carolina Charlotte, J. Murrey Atkins Library

Altmetrics WG B: Output Types & Identifiers, Co-chairs

Altmetrics WG B: Output Types & Identifiers, Members

  • Joy Painter, California Institute of Technology (Caltech) Library

Altmetrics WG C: Data Quality, Co-chairs

Altmetrics WG C: Data Quality, Members

  • Tilla Edmunds, Director, Web of Science Content Management, Clarivate Analytics
  • Angelia Ormiston, Johns Hopkins University Press/Sheridan Libraries of Johns Hopkins University
  • Zohreh Zahedi, Centre for Science and Technology Studies, University of Leiden