At our university, we are working on a transition to Open Science. Evaluating the work of our colleagues is a crucial part of this transition. The main aim is to strongly encourage our staff to take a different approach to doing research, one that engages societal stakeholders and their issues at both the regional and the international level. We believe this is key to the practice of good science. It means that our researchers should engage with society to define the problems to be solved, cooperate with societal actors to solve them, and share the results, data and code of their projects whenever possible. To properly incentivize and reward our staff for these different activities, we changed our internal evaluation system to reflect this new idea of how to do science. In practice, this means we use a plurality of quality indicators: moving away from the Journal Impact Factor and the h-index, and towards peer review of narratives about the work that has been done. Most importantly, we aim for diversity and inclusiveness of quality indicators, with Open Science practices particularly rewarded.

We have been working on a new evaluation system not only for researchers, but also for research units. On a national level a new protocol has been developed: the Strategic Evaluation Protocol (SEP). The SEP has been signed by the major institutions of science in the country, including the Royal Society, the Federation of Universities and the main government science funder, NWO. In the SEP, a research unit is asked to outline its vision, strategy and research aims upfront. Narratives, supported by data and in line with DORA, are used to assess whether these visions and aims have been met. Most importantly, again, units are free to choose the indicators that they believe appropriately capture their practices and the results of their work.
Next to quality and viability, four aspects of how the units conduct their research are taken into account: Open Science practices and efforts; PhD policy and training; Academic Culture (openness, safety, inclusiveness, and Research Integrity); and Human Resource Policy (Diversity and Talent Management). This system will be used on a national level starting in 2021.
Part of it came top-down, with the Dutch government announcing that “Open Science will be the standard” in the future of Dutch research. However, it was mainly triggered from the bottom up. (Medical) science has grown enormously, which has created fierce international competition and a strong rise in the number of publications. Studies from the early 2010s, published in the most prestigious journals, have shown that as a result of this growth, science suffers from a major reproducibility crisis. In addition, a lot of the science being done is irrelevant to clinical practice, with billions of dollars and euros being wasted every year globally. For the better part of the past decade, renowned scientists have been sounding the alarm about the current state of science, calling for a transition towards more responsible research that aims for quality in both science for science and science for society. Of even more particular concern to us, we have witnessed science turning into a metrics-based system in which the interests and needs of society have largely been pushed out of the picture. Producing journal articles within an (unhealthily) short time span is the only thing that counts, and this comes at the expense of often more applied research that is urgent and actually beneficial to society. Researchers increasingly feel that the way they are judged is wrong and unfair. This makes them angry, and may eventually lead them to cut corners. A more diverse set of indicators, aligned with people’s intrinsic motivation, takes away these perverse incentives and thus fosters research integrity at the systems level. Of course, this requires academic leadership at many levels of the university.
Several concerns were voiced when we started the process. There have been worries, especially among younger researchers, that changing the system would put them at a competitive disadvantage in terms of their research and career prospects compared to researchers from other institutions, in funding and grant programmes, that still adhere to the metric-based value system. This is a completely legitimate concern. However, a noticeable shift in ideology is taking place on a national, European and even global level, a shift that might level the playing field considerably. In Utrecht, we have been frontrunners, but we are confident that others will follow suit.

The new evaluation protocol triggered another response, namely that this new and pluriform approach no longer allows us to compare research output across different disciplines. If all fields of research are assessed on the basis of their own strategy, and with different indicators, there is no longer an objective means of comparison. Our response to this concern is to point out that the goals of these different groups are dissimilar by nature, and change over time. Hence, there is no actual need to compare these groups to each other; rather, they should be judged on their own merits. We have to let go of the idea that we are in a competition.

In addition, we noticed that Open Science practices, such as the FAIR data principles, need leadership and concrete actions to support them. People need support to get over the first hurdles in this transition to Open Science. Even more so, they need to get appropriate credit for what they do. As much of this work is done by non-academic staff, we need to acknowledge their role in the process. This might require the creation of new kinds of positions in the organization. The new set of evaluation criteria gives people the opportunity to change the profile of their staff’s work.
The UU Open Science Program Team consists of five people. In the coming years we will focus on actions: assisting researchers and policy makers in this transition and making sure the message gets across to the entire organization with its seven faculties. When these faculties actively start the transition to Open Science with a dedicated team, they will encounter the question of what the proposed initiatives mean within their own context. They know that things have to be done differently, but now it becomes very concrete for them, and they will have to weigh the costs and benefits of the proposed changes. We see our role as supporting this new programme: we help people and explain to them how they can change their daily habits.
Yes, at UMC Utrecht we do, but it took at least five years. Importantly, we are witnessing a positive change in the level of diversity within our organization. At the level of professorships, for instance, appointments have become far more diverse. Fields that were traditionally not highly rewarded, which work closer to clinical practice and involve less molecular and biochemical research, such as social medicine, ethics, medical history, nursing science, geriatrics, rehabilitation sciences and medical education, have recently recruited several new professors, many of them women. These appointments would previously not have been possible. In addition, we see that the ideas behind our programme are now being adopted on a national, European and even global level. More and more people are taking up the message we aim to spread. Importantly, we see movement on the side of the funders: they are really modifying their funding instruments to be more adaptive to the changing landscape of science, and they allow for group science, acknowledging the role of a diverse set of actors. We are really proud to have been at the forefront of this movement.
“We have to get out of the idea that we are in a competition”