The following article was recently published in PLOS ONE.
Title
Do altmetrics correlate with the quality of papers? A large-scale empirical study based on F1000Prime data
Authors
Lutz Bornmann
Max Planck Society
Robin Haunschild
Max Planck Institute for Solid State Research
Source
PLoS ONE 13(5): e0197133
DOI: 10.1371/journal.pone.0197133
Abstract
In this study, we address the question of whether (and, if so, to what extent) altmetrics are related to the scientific quality of papers (as measured by peer assessments). Only a few studies have previously investigated the relationship between altmetrics and assessments by peers. In the first step, we analyse the underlying dimensions of measurement for traditional metrics (citation counts) and altmetrics, using principal component analysis (PCA) and factor analysis (FA). In the second step, we test the relationship between these dimensions and the quality of papers (as measured by the post-publication peer-review system of F1000Prime assessments), using regression analysis. The results of the PCA and FA show that altmetrics operate along different dimensions: Mendeley counts are related to citation counts, whereas tweets form a separate dimension. The results of the regression analysis indicate that citation-based metrics and readership counts are significantly more strongly related to quality than tweets. This result questions the use of Twitter counts for research evaluation purposes on the one hand and, on the other, points to the potential usefulness of Mendeley reader counts.
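The abstract describes a two-step procedure: first extracting the underlying dimensions of the metric counts with PCA/FA, then regressing a peer-assessment quality measure on those dimensions. The following is only a rough sketch of that structure, not the authors' code; the toy data, column names, and the use of scikit-learn are assumptions made for illustration.

# Minimal sketch (assumed, not from the paper) of the two-step analysis
# outlined in the abstract: (1) PCA on metric counts, (2) regression of a
# quality score on the extracted dimensions. All data below are hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

# Hypothetical per-paper counts plus a peer-assessment quality score.
papers = pd.DataFrame({
    "citations":        [12, 45, 3, 78, 22, 5],
    "mendeley_readers": [30, 90, 10, 150, 40, 8],
    "tweets":           [2, 40, 0, 5, 60, 1],
    "quality_score":    [1, 2, 1, 3, 1, 1],
})
metrics = papers[["citations", "mendeley_readers", "tweets"]]

# Step 1: standardize the counts and extract the underlying dimensions.
scaled = StandardScaler().fit_transform(metrics)
pca = PCA(n_components=2)
dimensions = pca.fit_transform(scaled)
print("Explained variance ratio:", pca.explained_variance_ratio_)
print("Loadings:\n", pca.components_)  # which metrics load on which dimension

# Step 2: regress the quality measure on the extracted dimensions.
reg = LinearRegression().fit(dimensions, papers["quality_score"])
print("Coefficient per dimension:", reg.coef_)

The sketch only mirrors the overall structure described in the abstract; in the study itself the quality measure comes from F1000Prime assessments and the authors use their own model specification and data.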
Direct to Full Text Article
Direct to Complete Dataset (excluding DOIs)