Introduction

Why open science is important

Science is a cumulative process (Merton 1973) that builds on previous knowledge embodied in all types of research outputs (Dasgupta and David 1994; Walsh, Cohen, and Cho 2007). Although sharing research outputs as common goods should be the norm, this is actually not the case.

The Open Science (OS) movement was forged in response to this concern. It refers to a range of activities (Grattarola et al. 2024) including sharing research outputs. OS enables replication, improves productivity, limits redundancy, and helps create more robust research methods and a rich network of resources, thus increasing research efficiency (Murray and O’Mahony 2007; Walsh, Cohen, and Cho 2007; Shibayama and Baba 2011). In the end, it contributes to the collective building of scientific knowledge and societal progress (Cole et al. 2024).

How modern science is recognised

To appreciate any contribution to science, credit and recognition are a prerequisite to any ‘reward mechanism’ and need to be mapped in the overall research assessment scheme. Crediting is the explicit recognition of one’s contribution to a work, the process whereby ownership of a scientific work is attributed to an individual, a group of individuals or an institution (Merton 1973; Shibayama and Baba 2011; Walsh, Cohen, and Cho 2007). It is the first step in recognising the value of one’s work and is generally quantified by a series of metrics. It is an important process that builds scientists’ reputations. Crediting can be seen as a milestone in the process of rewarding, which encompasses several elements such as academic promotion, grants, dedicated staff and support materials that help produce subsequent discoveries (Latour and Woolgar 1979; Shibayama and Lawson 2021). In the case of published discoveries, credit is allocated by the community through attribution, peer review, and citation. It can also come from patenting in some specific cases (ALLEA 2023). Sharing intermediate or pre-publication outputs is, however, far less established, as it is more complex and does not necessarily fit into the conventional crediting system of science (Shibayama and Lawson 2021). A number of studies have underlined that academic research fails to recognise, value, and reward efforts to open up the scientific process (Hicks et al. 2015; Munafò et al. 2017; Wilsdon et al. 2015; Wouters et al. 2015). Yet it is crucial that these activities be valued: they require considerable time, energy and expertise to make outputs findable and accessible, and to make data and software compliant with the international standards that render them interoperable and reusable by others, as stated by the FAIR principles (Wilkinson et al. 2016).

The current sharing practice of academics

In the ‘publish or perish’ culture, some outputs (such as data, databases, or algorithms) may provide academics with a competitive advantage, which can lead them to withhold these outputs (Merton 1973; Dasgupta and David 1994; Haas and Park 2010; Haeussler et al. 2014). Moreover, commercialisation contexts, regulatory constraints, privacy issues and data reuse concerns, as well as shortages of funds, time, capacity and technical resources, can also act as barriers (Haas and Park 2010; Walsh, Cohen, and Cho 2007). As a result, the amount of outputs shared through open mechanisms is still limited in many communities and disciplines, and many resources are shared in one-to-one transactions (Shibayama and Baba 2011; Tenopir et al. 2015; Wallis, Rolando, and Borgman 2013). Thus, the degree of openness remains largely at the discretion of individual academics (Blume 1974; Hackett 2008; Nelson 2016). However, academics broadly agree that open sharing is beneficial to science, and numerous studies have shown that requests for shared resources are generally honoured (Czarnitzki, Grimpe, and Pellens 2015; Haas and Park 2010; Shibayama and Baba 2011; Walsh, Cohen, and Cho 2007). A clear consensus on how outputs should be shared and rewarded now needs to be established.

The current normative incentives for sharing

Since the Budapest Declaration (BOAI 2002), which specifically propelled the Open Access (OA) concept, governments, funders, research organisations and publishers have increasingly adopted formal OS policies (Manco 2022), with a primary focus on OA to publications and underlying research data. Pioneers in adopting such policies include, for example, the Scientific Electronic Library Online (SciELO), the National Institutes of Health (NIH), the White House’s Office of Science and Technology Policy (OSTP), the Gates Foundation, the Wellcome Trust, the UK research councils, Harvard University and Queensland University of Technology. In spite of this, sharing activities are often not sufficiently recognised or credited in formal assessments of researchers and project proposals, discouraging researchers from engaging in them (Arthur et al. 2021).

Efforts to address this challenge have led to the rise of several initiatives within the Responsible Research Assessment (RRA) movement. Notable examples are the DORA declaration (DORA 2012), the Dutch initiative ‘Science in Transition’ (Dijstelbloem et al. 2013), the Leiden Manifesto (Hicks et al. 2015) and the Metric Tide (Wilsdon et al. 2015). While these initiatives have not directly focused on recognising and rewarding OS practices, they have significantly contributed to promoting responsible metrics and the assessment of a broad spectrum of research outputs, including datasets and software.

The most notable initiatives that explicitly incorporated OS into the RRA discourse have emerged in the European Union. For example, the European Commission established a Working Group in 2016 on rewards under OS that formulated the OS Career Assessment Matrix (OS-CAM), suggesting various criteria for incorporating OS activities in the formal evaluations of researchers at all career stages (European Commission, Directorate-General for Research and Innovation et al. 2017). Moreover, the European Research Area Policy Agenda for 2022-2024 (EC DGRI 2021) has set the transformation of research assessment systems as a priority strategic action, including the rewarding of OS practices as part of this necessary change, further supported by the Conclusions of the Council of the European Union on Research Assessment and Implementation of OS (EU Council 2022). In line with this agenda, the Coalition for Advancing Research Assessment (CoARA 2022) was formed in 2022 as an initiative from several European organisations, bringing together stakeholders from the research ecosystem across 164 countries to enhance and harmonise research assessment practices, with an emphasis on recognising and rewarding behaviours underpinning OS activities. Additionally, the Horizon Europe programme has incorporated OS into its evaluation of all research proposals and project assessments, showcasing a prime example of how these practices can be embedded in funder evaluation schemes (EU AGA 2023; EU Parliament and Council 2021). Also, the ongoing Horizon Europe ‘GraspOS’ project (EU Horizon RIA GraspOS project 2023) has been specifically designed to support the emerging policy reforms and bring about the adoption of an RRA system that embeds OS practices, by developing federated infrastructure and tools. Lastly, cOAlition S funders, including the European Commission, have recently introduced a proposal named ‘Towards Responsible Publishing’ (cOAlition S 2023), which calls for the incorporation of OS practices into funders’ assessment policies and the elimination of journal metrics in the evaluation of researchers.

Some countries have also initiated steps to integrate OS practices into their research assessment schemes, with notable efforts seen in the Netherlands (VSNU et al. 2019), France (CNRS 2019), Norway (UHR Working Group 2021), Finland (Working group for responsible evaluation of a researcher 2020) and in Latin America and the Caribbean (CLACSO 2019). Details and more initiatives are given by Rijcke et al. (2023). Simultaneously, bottom-up international initiatives have emerged to explore more immediate ways to assess and credit OS activities. Notably, the data science community is getting organised under the umbrella of the Research Data Alliance (RDA) to articulate related concerns and offer recommendations (e.g., the RDA-SHARC Interest Group, the RDA-Research Evaluation Interest Group, and CODATA working groups).

Objective

In this paper, we provide a set of recommendations developed by the RDA-SHARC interest group to help implement various rewarding schemes for opening up science. These recommendations specifically emphasise the need to include sharing activities in research evaluation schemes as an overarching, valuable, and hopefully efficient mechanism to promote OS practices. The recommendations target a broad range of stakeholders in research and innovation systems, as highlighted by the UNESCO Recommendation on OS (UNESCO 2021), emphasising the collaborative effort of individual researchers, research institutions, funders, government policymakers and publishers in transforming the research culture towards OS (Nosek et al. 2023).

References

ALLEA. 2023. “The European Code of Conduct for Research Integrity – Revised Edition 2023.” Berlin. https://allea.org/code-of-conduct/.
Arthur, Paul Longley, Lydia Hearn, Lucy Montgomery, Hugh Craig, Alyssa Arbuckle, and Ray Siemens. 2021. “Open Scholarship in Australia: A Review of Needs, Barriers, and Opportunities.” Digital Scholarship in the Humanities 36 (4): 795–812. https://doi.org/10.1093/llc/fqaa063.
Blume, Stuart S. 1974. Toward a Political Sociology of Science. New York: Free Press.
BOAI. 2002. “Budapest Open Access Initiative.” 2002. https://www.budapestopenaccessinitiative.org/read/.
CLACSO. 2019. “FOLEC.” 2019. https://www.clacso.org/folec/.
CNRS. 2019. “CNRS Roadmap for Open Science.” https://www.science-ouverte.cnrs.fr/wp-content/uploads/2019/11/CNRS_Roadmap_Open_Science_18nov2019.pdf.
cOAlition S. 2023. “Plan S: Towards Responsible Publishing.” 2023. https://www.coalition-s.org/towards-responsible-publishing/.
CoARA. 2022. “Coalition for Advancing Research Assessment.” 2022. https://coara.eu/.
Cole, Nicki Lisa, Thomas Klebel, Simon Apartis, and Tony Ross-Hellauer. 2024. “The Societal Impact of Open Science–a Scoping Review.” SocArXiv. https://osf.io/preprints/socarxiv/tqrwg.
Czarnitzki, Dirk, Christoph Grimpe, and Maikel Pellens. 2015. “Access to Research Inputs: Open Science Versus the Entrepreneurial University.” Journal of Technology Transfer 40 (6): 1050–63. https://doi.org/10.1007/s10961-015-9392-0.
Dasgupta, Partha, and Paul A. David. 1994. “Toward a New Economics of Science.” Research Policy 23 (5): 487–521. https://doi.org/10.1016/0048-7333(94)01002-1.
Dijstelbloem, Huub, Frank Huisman, Frank Miedema, and Wijnand Mijnhardt. 2013. “Why Science Does Not Work as It Should and What to Do about It.” Science in Transition. https://scienceintransition.nl/english.
DORA. 2012. “San Francisco Declaration on Research Assessment.” 2012. https://sfdora.org/read/.
EC DGRI. 2021. “European Commission, Directorate-General for Research and Innovation. European Research Area Policy Agenda – Overview of Actions for the Period 2022-2024.” EU Publications Office. https://data.europa.eu/doi/10.2777/52110.
EU AGA. 2023. “EU Funding Programmes 2021-2027. Annotated Grant Agreement.” April 1, 2023. https://ec.europa.eu/info/funding-tenders/opportunities/docs/2021-2027/common/guidance/aga_en.pdf.
EU Council. 2022. “Council Conclusions on Research Assessment and Implementation of Open Science.” June 10, 2022. https://www.consilium.europa.eu/media/56958/st10126-en22.pdf.
EU Horizon RIA GraspOS project. 2023. “GraspOS: Next Generation Research Assessment to Promote Open Science.” 2023. https://cordis.europa.eu/project/id/101095129.
EU Parliament and Council. 2021. “Regulation (EU) 2021/695 of the European Parliament and of the Council.” April 28, 2021. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32021R0695&qid=1705589817543.
European Commission, Directorate-General for Research and Innovation, C Cabello Valdes, B Rentier, E Kaunismaa, J Metcalfe, F Esposito, D McAllister, K Maas, K Vandevelde, and C O’Carroll. 2017. Evaluation of Research Careers Fully Acknowledging Open Science Practices – Rewards, Incentives and/or Recognition for Researchers Practicing Open Science. EU Publications Office. https://data.europa.eu/doi/10.2777/75255.
Grattarola, Florencia, Hanna Shmagun, Christopher Erdmann, Anne Cambon-Thomsen, Mogens Thomsen, and Laurence Mabile. 2024. “Gaps Between Open Science Activities and Actual Recognition Systems: Insights from an International Survey.” SocArXiv. https://doi.org/10.31235/osf.io/hru2x.
Haas, Martine R., and Sangchan Park. 2010. “To Share or Not to Share? Professional Norms, Reference Groups, and Information Withholding Among Life Scientists.” Organization Science 21 (4): 873–91. https://doi.org/10.1287/orsc.1090.0500.
Hackett, Edward J. 2008. “Science as a Vocation in the 1990s: The Changing Organizational Culture of Academic Science.” In Research Ethics. Taylor & Francis.
Haeussler, Carolin, Lin Jiang, Jerry Thursby, and Marie Thursby. 2014. “Specific and General Information Sharing Among Competing Academic Researchers.” Research Policy 43 (3): 465–75. https://doi.org/10.1016/j.respol.2013.08.017.
Hicks, Diana, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols. 2015. “The Leiden Manifesto for Research Metrics.” Nature 520 (7548): 429–31. https://doi.org/10.1038/520429a.
Latour, B., and S. Woolgar. 1979. Laboratory Life: The Construction of Scientific Facts. Beverly Hills: Sage Publications.
Manco, Alejandra. 2022. “A Landscape of Open Science Policies Research.” Sage Open 12 (4): 21582440221140358. https://doi.org/10.1177/21582440221140358.
Merton, R. K. 1973. The Sociology of Science. Chicago: University of Chicago Press.
Munafò, Marcus R., Brian A. Nosek, Dorothy V. M. Bishop, Katherine S. Button, Christopher D. Chambers, Nathalie Percie Du Sert, Uri Simonsohn, Eric-Jan Wagenmakers, Jennifer J. Ware, and John P. A. Ioannidis. 2017. “A Manifesto for Reproducible Science.” Nature Human Behaviour 1 (1): 0021. https://doi.org/10.1038/s41562-016-0021.
Murray, Fiona, and Siobhán O’Mahony. 2007. “Exploring the Foundations of Cumulative Innovation: Implications for Organization Science.” Organization Science 18 (6): 1006–21.
Nelson, Andrew J. 2016. “How to Share "a Really Good Secret": Managing Sharing/Secrecy Tensions Around Scientific Knowledge Disclosure.” Organization Science 27 (2): 265–85. https://doi.org/10.1287/orsc.2015.1040.
Nosek, Brian A., George Alter, George C. Banks, Denny Borsboom, Sara D. Bowman, Steven J. Breckler, Stuart Buck, and et al. 2023. “Transparency and Openness Promotion (TOP) Guidelines.” OSF. September 27, 2023. https://osf.io/9f6gx.
Rijcke, Sarah de, Clemencia Cosentino, Robin Crewe, Carlo D’Ippoliti, Shaheen Motala-Timol, Noorsaadah Binti A. Rahman, Laura Rovelli, David Vaux, and Yao Yupeng. 2023. “The Future of Research Evaluation: A Synthesis of Current Debates and Developments. Discussion Paper.” https://doi.org/10.24948/2023.06.
Shibayama, Sotaro, and Yasunori Baba. 2011. “Sharing Research Tools in Academia: The Case of Japan.” Science and Public Policy 38 (8): 649–59. https://doi.org/10.3152/030234211X13122939587699.
Shibayama, Sotaro, and Cornelia Lawson. 2021. “The Use of Rewards in the Sharing of Research Resources.” Research Policy 50 (7): 104260. https://doi.org/10.1016/j.respol.2021.104260.
Tenopir, Carol, Elizabeth D. Dalton, Suzie Allard, Mike Frame, Ivanka Pjesivac, Ben Birch, Danielle Pollock, and Kristina Dorsett. 2015. “Changes in Data Sharing and Data Reuse Practices and Perceptions Among Scientists Worldwide.” PLoS One 10 (8): e0134826. https://doi.org/10.1371/journal.pone.0134826.
UHR Working Group. 2021. “NOR-CAM – A Toolbox for Recognition and Rewards in Academic Careers.” https://www.uhr.no/en/_f/p3/i86e9ec84-3b3d-48ce-8167-bbae0f507ce8/nor-cam-a-tool-box-for-assessment-and-rewards.pdf.
UNESCO. 2021. “UNESCO Recommendation on Open Science.” https://doi.org/10.54677/MNMH8546.
VSNU, NFU, KNAW, NWO, and ZonMw. 2019. “Recognition and Rewards. Room for Everyone’s Talent.” 2019. https://recognitionrewards.nl/about/position-paper/.
Wallis, Jillian C., Elizabeth Rolando, and Christine L. Borgman. 2013. “If We Share Data, Will Anyone Use Them? Data Sharing and Reuse in the Long Tail of Science and Technology.” PLoS One 8 (7): e67332. https://doi.org/10.1371/journal.pone.0067332.
Walsh, John P., Wesley M. Cohen, and Charlene Cho. 2007. “Where Excludability Matters: Material Versus Intellectual Property in Academic Biomedical Research.” Research Policy 36 (8): 1184–1203. https://doi.org/10.1016/j.respol.2007.04.006.
Wilkinson, Mark D., Michel Dumontier, IJsbrand Jan Aalbersberg, Gabrielle Appleton, Myles Axton, Arie Baak, Niklas Blomberg, et al. 2016. “The FAIR Guiding Principles for Scientific Data Management and Stewardship.” Scientific Data 3 (1): 160018. https://doi.org/10.1038/sdata.2016.18.
Wilsdon, James, Liz Allen, Eleonora Belfiore, Philip Campbell, Stephen Curry, Steven Hill, Richard Jones, et al. 2015. “The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management.” https://doi.org/10.13140/RG.2.1.4929.1363.
Working group for responsible evaluation of a researcher. 2020. “Good Practice in Researcher Evaluation. Recommendation for the Responsible Evaluation of a Researcher in Finland.” The Committee for Public Information (TJNK); Federation of Finnish Learned Societies (TSV). https://avointiede.fi/sites/default/files/2020-03/responsible-evalution.pdf.
Wouters, Paul, Mike Thelwall, Kayvan Kousha, Ludo Waltman, Sarah de Rijcke, Alex Rushforth, and Thomas Franssen. 2015. “The Metric Tide: Literature Review (Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management).” https://doi.org/10.13140/RG.2.1.5066.3520.