How Research Credibility Suffers in a Quantified Society (socialsciencespace.com)
An anonymous reader shares a report: Academia is in a credibility crisis. A record-breaking 10,000 scientific papers were retracted in 2023 because of scientific misconduct, and academic journals are overwhelmed by AI-generated images, data, and texts. To understand the roots of this problem, we must look at the role of metrics in evaluating the academic performance of individuals and institutions.
To gauge research quality, we count papers and citations and calculate impact factors; the higher the scores, the better. Academic performance is often expressed in numbers. Why? Quantification reduces complexity, makes academia manageable, allows easy comparisons among scholars and institutions, and gives administrators a sense of control. Besides, numbers seem objective and fair, which is why we use them to allocate status, tenure, attention, and funding to those who score well on these indicators.
The result? Quantity is often valued over quality. In The Quantified Society I coin the term "indicatorism": a blind focus on improving the indicators in spreadsheets while losing sight of what really matters. We are sometimes busier with "scoring" and "producing" than with "understanding." As a result, some have started gaming the system. The rector of one of the world's oldest universities, for one, set up citation cartels to boost his citation scores, while others reportedly go so far as to buy bogus citations. Even top-ranked institutions appear to play the indicator game, submitting false data to improve their positions in university rankings.