Journal Article: “Ten Simple Rules For Recognizing Data and Software Contributions In Hiring, Promotion, And Tenure”
The article (full text) linked below was recently published by PLOS Computational Biology.
Title
Ten Simple Rules For Recognizing Data and Software Contributions In Hiring, Promotion, And Tenure
Authors
Iratxe Puebla
DataCite
Giorgio A. Ascoli
George Mason University
Jeffrey Blume
University of Virginia
John Chodacki
California Digital Library, University of California
Joshua Finnell
Colgate University
David N. Kennedy
University of Massachusetts Chan Medical School
Bernard Mair
Association of Public and Land-grant Universities
Maryann E. Martone
University of California, San Diego
Jamie Wittenberg
University of Colorado Boulder
Jean-Baptiste Poline
McGill University
Source
PLOS Computational Biology
20(8): e1012296
DOI: 10.1371/journal.pcbi.1012296
Introduction
Changes in science practices are often perceived to be slow. It took about 10 years from the Collins and Tabak editorial on scientific reproducibility in 2014 to see data management mandates implemented by US funding agencies. However, open science practices have seen a sharp increase in adoption over the last few years, supported by policies (for example, those by the European Commission or the 2022 White House Office of Science and Technology Policy (OSTP) memo) as well as new generations of digital tools and scientists who are embedding open values in their research practices. In this faster-paced open science environment, universities are key to fostering adoption among researchers. Universities drive implementation by advancing best practices and accounting for the needs and norms of diverse departments and disciplines. Universities are positioned to catalyze adoption of open practices through their academic evaluation processes, particularly recruitment, tenure, and promotion. The capacity of researchers and instructors to engage with data and software scholarship will shape the next generation of students and scientists, and universities will play a crucial role in nurturing those skills by rewarding such contributions and expertise among their faculty.
The ways in which promotion and tenure committees operate vary significantly across universities and departments. While committees often have the capability to evaluate the rigor and quality of articles and monographs in their scientific field, assessment of practices concerning research data and software is a recent development and one that can be harder to implement, as there are few guidelines to facilitate the process. More specifically, the guidelines given to tenure and promotion committees often reference data and software in general terms, with some notable exceptions such as the guidelines in [5], and are almost systematically trumped by other factors such as the number and perceived impact of journal publications. The core issue is that many colleges establish a scholarship-versus-service dichotomy: Peer-reviewed articles or monographs published by university presses are considered scholarship, while community service, teaching, and other categories are given less weight in the evaluation process. This dichotomy unfairly disadvantages digital scholarship and community-based scholarship, including data and software contributions [6]. In addition, there is a lack of resources for faculties to facilitate the inclusion of responsible data and software metrics into evaluation processes or to assess faculty's expertise and competencies to create, manage, and use data and software as research objects. As a result, the outcome of the assessment by the tenure and promotion committee is as dependent on the guidelines provided as on the committee members' background and proficiency in the data and software domains.
The presented guidelines aim to help alleviate these issues and align the academic evaluation processes to the principles of open science. We focus here on hiring, tenure, and promotion processes, but the same principles apply to other areas of academic evaluation at institutions. While these guidelines are by no means sufficient for handling the complexity of a multidimensional process that involves balancing a large set of nuanced and diverse information, we hope that they will support an increasing adoption of processes that recognize data and software as key research contributions.
Direct to Full Text Article
Filed under: Data Files, Digital Collections, Funding, Interactive Tools, Libraries, Management and Leadership, News, Open Access, PLOS, Reports
About Gary Price
Gary Price (gprice@gmail.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington D.C. metro area. He earned his MLIS degree from Wayne State University in Detroit. Price has won several awards, including the SLA Innovations in Technology Award and Alumnus of the Year from the Wayne State University Library and Information Science Program. From 2006 to 2009 he was Director of Online Information Services at Ask.com.