January 24, 2021

Standards: NISO Releases Draft White Paper With Results From Phase One of Altmetrics Project, Open for Public Comment Through July 18

The National Information Standards Organization (NISO) has released a draft white paper on its altmetrics project.

More specifically, the white paper presents the results of phase one of NISO’s Alternative Assessment Metrics (Altmetrics) Project, which was announced about one year ago (June 20, 2013) with funding from the Sloan Foundation.

Public comment is open through July 18, 2014.

From NISO:

In Phase 1 of the project, three in-person meetings were held and 30 in-person interviews conducted to collect input from all relevant stakeholders, including researchers, librarians, university administrators, scientific research funders, and publishers. The draft white paper is the summary of the findings from those meetings and interviews, along with the identification of potential action items for further work in Phase II of the project.

[Clip]

NISO’s Altmetrics project gathered the variety of stakeholders in this arena to better understand the issues, obtain their input on which issues could best be addressed with standards or recommended practices, and prioritize the potential actions. This white paper organizes and summarizes the valuable feedback obtained from over 400 participants in the project and identifies a road forward for Phase II of the project.

“More than 250 ideas were generated by participants in the meetings and interviews,” states Todd Carpenter, NISO Executive Director. “We were able to condense these to 25 action items in nine categories: definitions, research outputs, discovery, research evaluation, data quality and gaming, grouping and aggregation, context, stakeholders’ perspectives, and adoption. The highest priority items focused on unique identifiers for scholarly works and for contributors, standards for usage statistics in the form of views and downloads, and building of infrastructure rather than detailed metrics analysis. We are now soliciting feedback on the draft white paper from the wider community prior to its completion.

“The white paper will then be used as the basis for Phase II: the development of one or more of the proposed standards and recommended practices.”

Read the complete news release, including comments from NISO Executive Director Todd Carpenter and Martin Fenner, Technical Lead for Article-Level Metrics at the Public Library of Science (PLOS) and consultant to NISO.

Direct to White Paper (17 pages; PDF)

Action Items

The paper includes 25 potential action items in nine categories.

1. Develop specific definitions for alternative assessment metrics.

2. Agree on proper usage of the term “Altmetrics,” or on using a different term.

3. Define subcategories for alternative assessment metrics, as needed.

4. Identify research output types that are applicable to the use of metrics.

5. Define relationships between different research outputs and develop metrics for this aggregated model.

6. Define appropriate metrics and calculation methodologies for specific output types, such as software, datasets, or performances.

7. Agree on main use cases for alternative assessment metrics and develop a needs assessment based on those use cases.

8. Develop statement about role of alternative assessment metrics in research evaluation.

9. Identify specific scenarios for the use of altmetrics in research evaluation (e.g., research data, social impact) and what gaps exist in data collection around these scenarios.

10. Promote and facilitate use of persistent identifiers in scholarly communications.

11. Research issues surrounding the reproducibility of metrics across providers.

12. Develop strategies to improve data quality through normalization of source data across providers.

13. Explore the creation of standardized APIs or download/exchange formats to facilitate data gathering.

14. Develop strategies to increase trust, e.g., openly available data, audits, or a clearinghouse.

15. Study potential strategies for defining and identifying systematic gaming.

16. Identify best practices for grouping and aggregating multiple data sources.

17. Identify best practices for grouping and aggregation by journal, author, institution, and funder.

18. Define and promote the use of contributorship roles.

19. Establish a context and normalization strategy over time, by discipline, country, etc.

20. Describe how the main use cases apply to and are valuable to the different stakeholder groups.

21. Identify best practices for identifying contributor categories (e.g., scholars vs. general public).

22. Identify organizations to include in further discussions.

23. Identify existing standards that need to be applied in the context of further discussions.

24. Identify and prioritize further activities.

25. Clarify researcher strategy (e.g., driven by researcher uptake vs. mandates by funders and institutions).


About Gary Price

Gary Price (gprice@mediasourceinc.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington D.C. metro area. Before launching INFOdocket, Price and Shirl Kennedy were the founders and senior editors at ResourceShelf and DocuTicker for 10 years. From 2006-2009 he was Director of Online Information Services at Ask.com, and is currently a contributing editor at Search Engine Land.
