March 30, 2020

Research Article: “Evaluating Institutional Open Access Performance: Methodology, Challenges and Assessment” (Preprint)

The following research article (preprint) was posted earlier today on bioRxiv.

Title

Evaluating Institutional Open Access Performance: Methodology, Challenges and Assessment

Authors

Chun-Kai Huang
Curtin University 

Cameron Neylon
Curtin University 

Richard Hosking
Curtin University 

Lucy Montgomery
Curtin University 

Katie Wilson
Curtin University 

Alkim Ozaygen
Curtin University 

Chloe Brookes-Kenworthy
Curtin University 

Source

via bioRxiv
DOI 10.1101/2020.03.19.998336

Abstract

Open Access to research outputs is rapidly becoming more important to the global research community and society. Changes are driven by funder mandates, institutional policy, grass-roots advocacy and culture change. It has been challenging to provide a robust, transparent and updateable analysis of progress towards open access that can inform these interventions, particularly at the institutional level. Here we propose a minimum reporting standard and present a large-scale analysis of open access progress across 1,207 institutions world-wide that shows substantial progress being made. The analysis detects responses that coincide with policy and funding interventions. Among the striking results are the high performance of Latin American and African universities, particularly for gold open access, whereas overall open access levels in Europe and North America are driven by repository-mediated access. We present a top 100 of global universities, with the world’s leading institutions achieving around 80% open access for 2017 publications.

Direct to Full Text Article (Preprint)
40 pages; PDF.

Also Posted Today: A Second Preprint by the Same Group of Authors: “Evaluating Institutional Open Access Performance: Sensitivity Analysis”

In the article “Evaluating institutional open access performance: Methodology, challenges and assessment” we develop the first comprehensive and reproducible workflow that integrates multiple bibliographic data sources for evaluating institutional open access (OA) performance. The major data sources include Web of Science, Scopus, Microsoft Academic, and Unpaywall. However, each of these databases continues to update, both actively and retrospectively. This implies that the results produced by the proposed process are potentially sensitive to both the choice of data source and the versions used. In addition, there remains the issue of selection bias relating to sample size and margin of error. The current work shows that the levels of sensitivity relating to the above issues can be significant at the institutional level. Hence, transparency and clear documentation of the choices made on data sources (and their versions) and cut-off boundaries are vital for reproducibility and verifiability.
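The sample-size sensitivity the abstract mentions is easy to see with a toy calculation. The sketch below is not the authors’ actual workflow; it is a minimal illustration, with invented DOIs and a hypothetical `oa_share` helper, of how an institution’s OA proportion carries a margin of error that shrinks with output count — one reason the same OA level can look more or less certain for small versus large institutions.

```python
import math

def oa_share(records):
    """records: list of (doi, is_oa) tuples for one institution.

    Returns the OA proportion and a normal-approximation 95%
    margin of error for that proportion.
    """
    n = len(records)
    oa = sum(1 for _, is_oa in records if is_oa)
    p = oa / n
    moe = 1.96 * math.sqrt(p * (1 - p) / n)
    return p, moe

# Toy example: the same 50% OA level at two institution sizes.
small = [(f"10.1234/a{i}", i % 2 == 0) for i in range(100)]
large = [(f"10.1234/b{i}", i % 2 == 0) for i in range(10_000)]

p1, m1 = oa_share(small)   # p = 0.5, moe ≈ 0.098 (about ±10 points)
p2, m2 = oa_share(large)   # p = 0.5, moe ≈ 0.0098 (about ±1 point)
```

Under these assumptions, a small institution’s reported OA percentage can swing by roughly ten percentage points from sampling alone, which is why the preprint argues for documenting cut-off boundaries alongside data-source versions.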

About Gary Price

Gary Price (gprice@mediasourceinc.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington, D.C. metro area. Before launching INFOdocket, Price and Shirl Kennedy were the founders and senior editors of ResourceShelf and DocuTicker for 10 years. From 2006 to 2009 he was Director of Online Information Services at Ask.com, and he is currently a contributing editor at Search Engine Land.
