New Report From NASEM “Examines Reproducibility and Replicability in Science, Recommends Ways to Improve Transparency and Rigor in Research”

May 7, 2019 by Gary Price

From the National Academies of Sciences, Engineering, and Medicine (NASEM):

While computational reproducibility in scientific research is generally expected when the original data and code are available, lack of ability to replicate a previous study — or obtain consistent results looking at the same scientific question but with different data — is more nuanced and occasionally can aid in the process of scientific discovery, says a new congressionally mandated report from the National Academies of Sciences, Engineering, and Medicine.

Reproducibility and Replicability in Science recommends ways that researchers, academic institutions, journals, and funders should help strengthen rigor and transparency in order to improve the reproducibility and replicability of scientific research.

Defining Reproducibility and Replicability

The terms “reproducibility” and “replicability” are often used interchangeably, but the report uses each term to refer to a separate concept.  Reproducibility means obtaining consistent computational results using the same input data, computational steps, methods, code, and conditions of analysis.  Replicability means obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data.

Reproducing research involves using the original data and code, while replicating research involves collecting new data and using methods similar to those of the previous study, the report says.  Even when a study was rigorously conducted according to best practices, correctly analyzed, and transparently reported, it may fail to be replicated.
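
To make the distinction concrete, here is a minimal Python sketch (not drawn from the report; the toy study and all names are illustrative): rerunning the same code on the same data reproduces the result exactly, while rerunning the same methods on newly collected data replicates it only up to sampling variation.

```python
# Illustrative only: a toy study estimating a population mean.
# "Reproduce" = rerun the same analysis on the same data;
# "replicate" = rerun the same analysis on newly collected data.
import random
import statistics

def collect_data(seed: int, n: int = 100) -> list[float]:
    """Stand-in for data collection: n noisy measurements of a true value of 10."""
    rng = random.Random(seed)
    return [10.0 + rng.gauss(0, 2) for _ in range(n)]

def analyze(data: list[float]) -> float:
    """The study's 'computational steps': estimate the mean."""
    return statistics.mean(data)

original   = analyze(collect_data(seed=1))  # the original study
reproduced = analyze(collect_data(seed=1))  # same data and code -> identical result
replicated = analyze(collect_data(seed=2))  # new data, same methods -> consistent, not identical

print(original == reproduced)            # True: computational reproducibility
print(abs(original - replicated) < 1.0)  # a loose consistency check: replicability
```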

“Being able to reproduce the computational results of another researcher starting with the same data and replicating a previous study to test its results facilitate the self-correcting nature of science, and are often cited as hallmarks of good science,” said Harvey Fineberg, president of the Gordon and Betty Moore Foundation and chair of the committee that conducted the study.  “However, factors such as lack of transparency of reporting, lack of appropriate training, and methodological errors can prevent researchers from being able to reproduce or replicate a study.  Research funders, journals, academic institutions, policymakers, and scientists themselves each have a role to play in improving reproducibility and replicability by ensuring that scientists adhere to the highest standards of practice, understand and express the uncertainty inherent in their conclusions, and continue to strengthen the interconnected web of scientific knowledge — the principal driver of progress in the modern world.”

Reproducibility 

The committee’s definition of reproducibility focuses on computation because most scientific and engineering research disciplines use computation as a tool, and the abundance of data and widespread use of computation have transformed many disciplines. However, this revolution is not yet uniformly reflected in how scientists use software and how scientific results are published and shared, the report says. These shortfalls have implications for reproducibility, because scientists who wish to reproduce research may lack the information or training they need to do so.

When results are produced by complex computational processes using large volumes of data, the methods section of a scientific paper is insufficient to convey the necessary information for others to reproduce the results, the report says. Additional information related to data, code, models, and computational analysis is needed.
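
In practice, that additional information is often captured in a machine-readable provenance record published alongside the results. The sketch below is one minimal, hypothetical way to do this in Python; the field choices are assumptions, not the report's prescription.

```python
# A minimal, hypothetical provenance record for a single-script analysis.
# Hashing the input data lets a second researcher confirm they are starting
# from exactly the same data the original analysis used.
import hashlib
import json
import platform
import sys

def sha256_of(path: str) -> str:
    """Content hash of the input data file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def provenance_record(data_path: str, seed: int) -> str:
    """JSON description of the computational conditions of the analysis."""
    return json.dumps({
        "data_sha256": sha256_of(data_path),
        "random_seed": seed,
        "python_version": sys.version,
        "platform": platform.platform(),
    }, indent=2)

# "measurements.csv" is a stand-in for the study's actual input file:
# print(provenance_record("measurements.csv", seed=1))
```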

If sufficient additional information is available and a second researcher follows the methods described by the first researcher, one expects in many cases to obtain the same exact numeric values – or bitwise reproduction. For some research questions, bitwise reproduction may not be attainable and reproducible results could be obtained within an accepted range of variation.
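
The difference is easy to see with floating-point arithmetic, where even mathematically equivalent computations need not be bitwise identical (a generic Python illustration, not an example from the report):

```python
import math

reported   = 0.1 + 0.2  # value recorded in the original study's artifact
recomputed = 0.3        # a second researcher's mathematically equivalent rerun

print(recomputed == reported)                            # False: not bitwise identical
print(math.isclose(recomputed, reported, rel_tol=1e-9))  # True: within an accepted tolerance
```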

The evidence base to determine the prevalence of non-reproducibility in research is incomplete, and determining the extent of issues related to computational reproducibility across or within fields of science would be a massive undertaking with a low probability of success, the committee found. However, a number of systematic efforts to reproduce computational results across a variety of fields have failed in more than half of attempts made — mainly due to insufficient detail on data, code, and computational workflow.

Replicability 

One important way to confirm or build on previous results is to follow the same methods, obtain new data, and see if the results are consistent with the original. A successful replication does not guarantee that the original scientific results of a study were correct, however, nor does a single failed replication conclusively refute the original claims, the report says.

Non-replicability can arise from a number of sources. The committee classified sources of non-replicability into those that are potentially helpful to gaining knowledge, and those that are unhelpful.

Potentially helpful sources of non-replicability include inherent but uncharacterized uncertainties in the system being studied. These sources of non-replicability are a normal part of the scientific process, due to the intrinsic variation or complexity in nature, the scope of current scientific knowledge, and the limits of current technologies. In such cases, a failure to replicate may lead to the discovery of new phenomena or new insights about variability in the system being studied.

In other cases, the report says, non-replicability is due to shortcomings in the design, conduct, and communication of a study. Whether arising from lack of knowledge, perverse incentives, sloppiness, or bias, these unhelpful sources of non-replicability reduce the efficiency of scientific progress.

Unhelpful sources of non-replicability can be minimized through initiatives and practices aimed at improving research design and methodology through training and mentoring, repeating experiments before publication, rigorous peer review, utilizing tools for checking analysis and results, and better transparency in reporting. Efforts to minimize avoidable and unhelpful sources of non-replicability warrant continued attention, the report says.

Researchers who knowingly use questionable research practices with the intent to deceive are committing misconduct or fraud. It can be difficult in practice to differentiate between honest mistakes and deliberate misconduct, because the underlying action may be the same while the intent is not. Scientific misconduct in the form of misrepresentation and fraud is a continuing concern for all of science, even though it accounts for a very small percentage of published scientific papers, the committee found.

Improving Reproducibility and Replicability in Research

The report recommends a range of steps that stakeholders in the research enterprise should take to improve reproducibility and replicability, including:

  • All researchers should include a clear, specific, and complete description of how the reported results were reached. Reports should include details appropriate for the type of research, such as a clear description of all methods, instruments, materials, procedures, measurements, and other variables involved in the study; a clear description of the analysis of data and the decisions for exclusion of some data or inclusion of other data; and a discussion of the uncertainty of the measurements, results, and inferences.
  • Funding agencies and organizations should consider investing in research and development of open-source, usable tools and infrastructure that support reproducibility for a broad range of studies across different domains in a seamless fashion. Concurrently, investments would be helpful in outreach to inform and train researchers on best practices and how to use these tools.
  • Journals should consider ways to ensure computational reproducibility for publications that make claims based on computations, to the extent ethically and legally possible.
  • The National Science Foundation should take steps to facilitate the transparent sharing and availability of digital artifacts, such as data and code, for NSF-funded studies – including developing a set of criteria for trusted open repositories to be used by the scientific community for objects of the scholarly record, and endorsing or considering the creation of code and data repositories for long-term archiving and preservation of digital artifacts that support claims made in the scholarly record based on NSF-funded research, among other actions.

Confidence in Science

Replicability and reproducibility, useful as they are in building confidence in scientific knowledge, are not the only ways to gain confidence in scientific results.  Research synthesis and meta-analysis, for example, are valuable methods for assessing the reliability and validity of bodies of research, the report says.  A goal of science is to understand the overall effect from a set of scientific studies, not to strictly determine whether any one study has replicated any other.
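
For example, one standard synthesis technique (a textbook method, not specific to this report) is fixed-effect, inverse-variance meta-analysis, which pools effect estimates from several studies by weighting each by the inverse of its variance. The sketch below uses made-up numbers:

```python
# Fixed-effect inverse-variance meta-analysis with hypothetical inputs.
import math

# (effect estimate, standard error) from three hypothetical studies
studies = [(0.42, 0.15), (0.30, 0.20), (0.55, 0.25)]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# 95% confidence interval for the overall effect across the set of studies
low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f}, 95% CI ({low:.3f}, {high:.3f})")
```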

Among other related recommendations, the report says that people making personal or policy decisions based on scientific evidence should be wary of basing a serious decision on the results of a single study, no matter how promising. By the same token, they should not take a single new contrary study as a refutation of scientific conclusions supported by multiple lines of previous evidence.

The study — undertaken by the Committee on Reproducibility and Replicability in Science — was sponsored by the National Science Foundation and the Alfred P. Sloan Foundation.

Direct to Full Text Report

Filed under: Associations and Organizations, Data Files, Funding, Journal Articles, News, Open Access, Preservation, Reports


About Gary Price

Gary Price (gprice@mediasourceinc.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington D.C. metro area. Before launching INFOdocket, Price and Shirl Kennedy were the founders and senior editors at ResourceShelf and DocuTicker for 10 years. From 2006-2009 he was Director of Online Information Services at Ask.com, and is currently a contributing editor at Search Engine Land.
