“How journal rankings can suppress interdisciplinarity. The case of innovation studies in business and management” (via arXiv)
by: Ismael Rafols, Loet Leydesdorff, Alice O’Hare, Paul Nightingale, Andy Stirling
From the Abstract:
While interdisciplinary research (IDR) is valued as a means of encouraging scientific breakthroughs, innovation and socially relevant science, it is also widely perceived as being at a disadvantage in research evaluations. Various qualitative studies have provided evidence that peer review tends to be biased against IDR. However, quantitative investigations of this claim have been few and inconclusive. This paper provides new quantitative evidence of how journal rankings can disadvantage IDR in evaluation. Using publication data, it compares the degree of interdisciplinarity and the research performance of innovation studies units with business and management schools in the UK. The paper uses various mappings and metrics to show that innovation studies units are consistently more interdisciplinary than business and management schools. It shows that the top journals in the Association of Business Schools’ rankings span a less diverse set of disciplines than lower-ranked journals, and that this bias results in a more favourable performance assessment of the business and management schools, which are more disciplinary-focused. Lastly, it demonstrates how a citation-based analysis (generally agreed to be more accurate) challenges the ranking-based assessment. In summary, the investigation illustrates how allegedly ‘excellence-based’ journal rankings have a bias in favour of mono-disciplinary research. Given the high correlation between journal ranks and ratings in the UK’s research assessment, this bias is likely to negatively affect the valuation of interdisciplinary organisations and encourage publication patterns to be more compliant with disciplinary authority. We discuss the general implications of the mechanism of IDR suppression illuminated here.
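The abstract's "degree of interdisciplinarity" is typically measured with diversity metrics; in this literature (notably in Stirling's own earlier work) a standard choice is the Rao-Stirling diversity index, which combines how a unit's output is spread across disciplines with how distant those disciplines are from one another. The sketch below is my own illustration of that kind of index, not code from the paper, and the discipline shares and distance matrix are made up for the example.

```python
# Minimal sketch of a Rao-Stirling-style diversity index (illustrative only):
#   Delta = sum over i != j of p_i * p_j * d_ij
# where p_i is the share of a unit's publications (or references) in
# discipline i, and d_ij is a distance between disciplines i and j,
# e.g. 1 minus the cosine similarity of their citation profiles.

import numpy as np

def rao_stirling_diversity(proportions, distance):
    """Compute a Rao-Stirling diversity score for one research unit.

    proportions: 1-D array summing to 1, share of output per discipline.
    distance: square symmetric matrix of pairwise distances d_ij.
    """
    p = np.asarray(proportions, dtype=float)
    d = np.asarray(distance, dtype=float).copy()
    np.fill_diagonal(d, 0.0)  # only cross-discipline pairs contribute
    return float(p @ d @ p)   # sum_i sum_j p_i * d_ij * p_j

# Hypothetical example: three disciplines with illustrative distances.
p = np.array([0.5, 0.3, 0.2])
d = np.array([[0.0, 0.8, 0.9],
              [0.8, 0.0, 0.4],
              [0.9, 0.4, 0.0]])
print(rao_stirling_diversity(p, d))  # higher values -> more interdisciplinary mix
```

A unit publishing only within one discipline scores zero, while one spreading its output across several distant disciplines scores higher, which is the sense in which the paper can say innovation studies units are "consistently more interdisciplinary".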