‘Knowledge production must fundamentally change’
‘Free-market economics has reduced the value of higher education to a question of efficiency and productivity,’ says Sarah de Rijcke. And, she adds, there is no clear description of what we actually want scientific research to achieve. Inaugural speech on Friday 17 May.
Sarah de Rijcke researches how the value of scientific research is evaluated. As Professor of Science and Evaluation Studies at the Centre for Science and Technology Studies (CWTS), she looks at the formal evaluation system, the informal evaluation criteria and higher-education policy. She investigates how terms such as ‘excellent’ and ‘relevant’ are fleshed out, as well as the effects this has on scholarship itself. Moreover, she asks whether this is the right way to determine the value of scholarship, a question that she answers with an emphatic ‘no’ in her inaugural lecture.
How much cash did they bring in?
An important conclusion that De Rijcke draws is that, in evaluations, the dominant free-market economics goes too far in reducing the question of the value of knowledge to one of efficiency and productivity. How much did a researcher publish? What was the impact factor of the journals in which their work was published? How many citations did their work receive? How much money did they bring in? De Rijcke: ‘These common indicators are a bit like the light of a streetlight: they only shed light on a small area, which is a shame. We have to consider what such a limited focus means in the longer term for higher education as a critical part of society. And whether universities aren’t becoming unattractive employers for people who want to do more for the world.’
Research evaluation has to change
In her inaugural lecture, De Rijcke also responds to the signing of the San Francisco Declaration on Research Assessment (DORA) by KNAW, NWO and ZonMw. DORA is a global initiative that aims to reduce the reliance on indicators such as publications and citations in evaluating research and researchers. This is a very important sign that research evaluation needs to be improved, she says.
Removing indicators is not enough
Another important conclusion that De Rijcke draws is that removing certain indicators – as DORA intends – is not enough. She thinks that more must happen before researchers see a fundamental effect. ‘This is because, for instance, indicator-thinking has become intrinsically linked with how research is organised. This involves all sorts of important decisions. And certain norms and values that are linked to indicators are becoming increasingly integrated into knowledge production. Research is designed and adjusted to ensure that it attains a high score.’
Fundamental reform
Another factor is at play too. When the now-popular indicators were introduced, they tied in seamlessly with deep-rooted quality-evaluation methods and further reinforced these. De Rijcke wonders whether the DORA signatories are ready for a much trickier challenge: the fundamental reform of how academic knowledge is generated. ‘Open science could be an important tool, as could smart quantitative information,’ she says. ‘But if you ask me, what we need first is a clear picture of what we want to achieve with our universities and research. Only then is it worth thinking about suitable forms of evaluation.’
Text: Steven Hagers