Introduction

Why open science is important

Science is a cumulative process (Merton, 1973) that builds on previous knowledge embodied in all types of research outputs (Dasgupta & David, 1994; Walsh, Cohen, & Cho, 2007). Although sharing research outputs as common goods should be the norm, in practice it is not.

The Open Science (OS) movement was forged in response to this concern. It covers a range of activities (Grattarola et al., 2024), including the sharing of research outputs. OS enables replication, improves productivity, limits redundancy, and helps create more robust research methods and a rich network of resources, thus increasing research efficiency (Murray & O’Mahony, 2007; Shibayama & Baba, 2011; Walsh et al., 2007). Ultimately, it contributes to the collective building of scientific knowledge and to societal progress (Cole, Klebel, Apartis, & Ross-Hellauer, 2024).

How modern science is recognised

To appreciate any contribution to science, credit and recognition are a prerequisite to any ‘reward mechanism’ and need to be mapped into the overall research assessment scheme. Crediting is the explicit recognition of one’s contribution to a work: the process whereby the origin of a scientific work is attributed to an individual, a group of individuals, or an institution (Merton, 1973; Shibayama & Baba, 2011; Walsh et al., 2007). It is the first step in recognising the value of one’s work and is generally quantified by a series of metrics. It is an important process that builds scientists’ reputations. Crediting can be seen as a milestone in the process of rewarding, which encompasses several elements such as academic promotion, grants, dedicated staff, and support materials that help produce subsequent discoveries (Latour & Woolgar, 1979; Shibayama & Lawson, 2021). In the case of published discoveries, credit is allocated by the community through attribution, peer-review approval, and citation; in some specific cases it can also come from patenting (ALLEA, 2023). Sharing intermediate or pre-publication outputs is, however, far less established, as it is more complex and does not necessarily fit into the conventional crediting system of science (Shibayama & Lawson, 2021). A number of studies have underlined that academic research fails to recognise, value, and reward efforts to open up the scientific process (Hicks, Wouters, Waltman, Rijcke, & Rafols, 2015; Munafò et al., 2017; Wilsdon et al., 2015; Wouters et al., 2015). Yet it is crucial that these activities be valued: they require considerable time, energy, and expertise to make outputs findable and accessible, and to make data and software compliant with the international standards that render them interoperable and reusable by others, as stated by the FAIR principles (Wilkinson et al., 2016).

The current sharing practice of academics

In the ‘publish or perish’ culture, some outputs (such as data, databases, or algorithms) may provide academics with an advantage under high competition, which can lead them not to share these outputs (Dasgupta & David, 1994; Haas & Park, 2010; Haeussler, Jiang, Thursby, & Thursby, 2014; Merton, 1973). Commercialisation contexts, regulatory constraints, privacy issues, or data-reuse concerns, as well as shortages of funds, time, skills, or technical resources, can also act as barriers (Haas & Park, 2010; Walsh et al., 2007). As a result, the amount of outputs shared through open mechanisms is still limited in many communities and disciplines, and many resources are shared in one-to-one transactions (Shibayama & Baba, 2011; Tenopir et al., 2015; Wallis, Rolando, & Borgman, 2013). The degree of openness thus remains largely at the discretion of individual academics (Blume, 1974; Hackett, 2008; Nelson, 2016). However, academics broadly agree that open sharing is beneficial to science, and numerous studies have shown that sharing requests are generally honoured (Czarnitzki, Grimpe, & Pellens, 2015; Haas & Park, 2010; Shibayama & Baba, 2011; Walsh et al., 2007). A clear consensus on how outputs should be shared and rewarded now needs to be established.

The current normative incentives for sharing

Since the Budapest Declaration (BOAI, 2002), which specifically propelled the Open Access (OA) concept, governments, funders, research organisations, and publishers have increasingly adopted formal OS policies (Manco, 2022), with a primary focus on OA to publications and underlying research data. Pioneers in adopting such policies include, for example, the Scientific Electronic Library Online (SciELO), the National Institutes of Health (NIH), the White House’s Office of Science and Technology Policy (OSTP), the Gates Foundation, the Wellcome Trust, the UK research councils, Harvard University, and Queensland University of Technology. In spite of this, sharing activities are often not sufficiently recognised or credited in formal assessments of researchers and project proposals, discouraging researchers from engaging in them (Arthur et al., 2021).

Efforts to address this challenge have led to the rise of several initiatives within the Responsible Research Assessment (RRA) movement. Notable examples are the DORA declaration (DORA, 2012), the Dutch initiative ‘Science in Transition’ (Dijstelbloem, Huisman, Miedema, & Mijnhardt, 2013), the Leiden Manifesto (Hicks et al., 2015) and the Metric Tide (Wilsdon et al., 2015). While these initiatives have not directly focused on recognising and rewarding OS practices, they have significantly contributed to promoting responsible metrics and the assessment of a broad spectrum of research outputs, including datasets and software.

The most notable initiatives that explicitly incorporated OS into the RRA discourse have emerged in the European Union. For example, in 2016 the European Commission established a Working Group on rewards under OS that formulated the OS Career Assessment Matrix (OS-CAM), suggesting various criteria for incorporating OS activities into the formal evaluations of researchers at all career stages (European Commission, Directorate-General for Research and Innovation et al., 2017). Moreover, the European Research Area Policy Agenda for 2022-2024 (EC DGRI, 2021) has set the transformation of research assessment systems as a priority strategic action, including the rewarding of OS practices as part of this necessary change, further supported by the Conclusions of the Council of the European Union on Research Assessment and Implementation of OS (EU Council, 2022). In line with this agenda, the Coalition for Advancing Research Assessment (CoARA, 2022) was formed in 2022 as an initiative from several European organisations, bringing together stakeholders from the research ecosystem across 164 countries to enhance and harmonise research assessment practices, with an emphasis on recognising and rewarding behaviours underpinning OS activities. Additionally, the Horizon Europe programme has incorporated OS into its evaluation of all research proposals and project assessments, a prime example of how these practices can be embedded in funder evaluation schemes (EU AGA, 2023; EU Parliament and Council, 2021). Also, the ongoing Horizon Europe projects GraspOS (EU Horizon RIA GraspOS project, 2023) and OPUS (OPUS project, 2022) have been specifically designed to support reforms of research assessment systems that include OS practices.
Lastly, cOAlition S funders, including the European Commission, have recently introduced a proposal named ‘Towards Responsible Publishing’ (cOAlition S, 2023), which calls for the incorporation of OS practices into funders’ assessment policies and the elimination of journal metrics in the evaluation of researchers.

Some countries have also initiated steps to integrate OS practices into their research assessment schemes, with notable efforts seen in the Netherlands (VNSU, NFU, KNAW, NWO, & ZonMw, 2019), France (CNRS, 2019), Norway (UHR Working Group, 2021), Finland (Working group for responsible evaluation of a researcher, 2020), and in Latin America and the Caribbean (CLACSO, 2019). Further details and additional initiatives are given by Rijcke et al. (2023). Simultaneously, bottom-up international initiatives have emerged to explore more immediate ways to assess and credit OS activities. Notably, the data science community is organising under the umbrella of the Research Data Alliance (RDA) to articulate related concerns and offer recommendations (e.g., RDA-EoR IG, 2023; RDA-SHARC IG, 2017; CODATA WG, 2024).

Objective

In this paper, we provide a set of recommendations developed by the RDA-SHARC interest group to help implement various rewarding schemes for opening up science. These recommendations specifically emphasise the need to include sharing activities in research evaluation schemes as an overarching, valuable, and hopefully efficient mechanism to promote OS practices. The recommendations target a broad range of stakeholders in research and innovation systems, as highlighted by the UNESCO Recommendation on OS (UNESCO, 2021), emphasising the collaborative effort of individual researchers, research institutions and any organisation performing research (public and private), funders, government policymakers and publishers in transforming the research culture towards OS (Nosek et al., 2023).

References

ALLEA. (2023). The European code of conduct for research integrity – revised edition 2023. Berlin. Retrieved from https://allea.org/code-of-conduct/
Arthur, P. L., Hearn, L., Montgomery, L., Craig, H., Arbuckle, A., & Siemens, R. (2021). Open scholarship in Australia: A review of needs, barriers, and opportunities. Digital Scholarship in the Humanities, 36(4), 795–812. https://doi.org/10.1093/llc/fqaa063
Blume, S. S. (1974). Toward a political sociology of science. Free Press, New York.
BOAI. (2002). Budapest Open Access Initiative. Retrieved January 18, 2024, from https://www.budapestopenaccessinitiative.org/read/
CLACSO. (2019). FOLEC. Retrieved January 18, 2024, from https://www.clacso.org/folec/
CNRS. (2019). CNRS roadmap for open science. Retrieved from https://www.science-ouverte.cnrs.fr/wp-content/uploads/2019/11/CNRS_Roadmap_Open_Science_18nov2019.pdf
cOAlition S. (2023). Plan S. Towards responsible publishing. Retrieved January 18, 2024, from https://www.coalition-s.org/towards-responsible-publishing/
CoARA. (2022). Coalition for advancing research assessment. Retrieved January 18, 2024, from https://coara.eu/
CODATA WG. (2024). CODATA Working Groups. Retrieved from https://codata.org/initiatives/working-groups/
Cole, N. L., Klebel, T., Apartis, S., & Ross-Hellauer, T. (2024). The societal impact of open science – a scoping review. Retrieved from https://osf.io/preprints/socarxiv/tqrwg
Czarnitzki, D., Grimpe, C., & Pellens, M. (2015). Access to research inputs: Open science versus the entrepreneurial university. Journal of Technology Transfer, 40(6), 1050–1063. https://doi.org/10.1007/s10961-015-9392-0
Dasgupta, P., & David, P. A. (1994). Toward a new economics of science. Research Policy, 23(5), 487–521. https://doi.org/10.1016/0048-7333(94)01002-1
Dijstelbloem, H., Huisman, F., Miedema, F., & Mijnhardt, W. (2013). Why science does not work as it should and what to do about it. Science in Transition. Retrieved from https://scienceintransition.nl/english
DORA. (2012). San Francisco declaration on research assessment. Retrieved January 18, 2024, from https://sfdora.org/read/
EC DGRI. (2021). European Commission, Directorate-General for Research and Innovation. European research area policy agenda – overview of actions for the period 2022-2024. EU Publications Office. Retrieved from https://data.europa.eu/doi/10.2777/52110
EU AGA. (2023, April 1). EU funding programmes 2021-2027. Annotated grant agreement. Retrieved January 18, 2024, from https://ec.europa.eu/info/funding-tenders/opportunities/docs/2021-2027/common/guidance/aga_en.pdf
EU Council. (2022, June 10). Council conclusions on research assessment and implementation of open science. Retrieved January 18, 2024, from https://www.consilium.europa.eu/media/56958/st10126-en22.pdf
EU Horizon RIA GraspOS project. (2023). GraspOS: Next generation research assessment to promote open science. Retrieved January 18, 2024, from https://cordis.europa.eu/project/id/101095129
EU Parliament and Council. (2021, April 28). Regulation (EU) 2021/695 of the european parliament and of the council. Retrieved January 18, 2024, from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32021R0695&qid=1705589817543
European Commission, Directorate-General for Research and Innovation, Cabello Valdes, C., Rentier, B., Kaunismaa, E., Metcalfe, J., Esposito, F., … O’Carroll, C. (2017). Evaluation of research careers fully acknowledging open science practices – rewards, incentives and/or recognition for researchers practicing open science. EU Publications Office. Retrieved from https://data.europa.eu/doi/10.2777/75255
Grattarola, F., Shmagun, H., Erdmann, C., Cambon-Thomsen, A., Thomsen, M., & Mabile, L. (2024). Gaps between open science activities and actual recognition systems: Insights from an international survey. https://doi.org/10.31235/osf.io/hru2x
Haas, M. R., & Park, S. (2010). To share or not to share? Professional norms, reference groups, and information withholding among life scientists. Organization Science, 21(4), 873–891. https://doi.org/10.1287/orsc.1090.0500
Hackett, E. J. (2008). Research ethics. Science as a vocation in the 1990s. The changing organizational culture of academic science. Taylor & Francis.
Haeussler, C., Jiang, L., Thursby, J., & Thursby, M. (2014). Specific and general information sharing among competing academic researchers. Research Policy, 43(3), 465–475. https://doi.org/10.1016/j.respol.2013.08.017
Hicks, D., Wouters, P., Waltman, L., Rijcke, S. de, & Rafols, I. (2015). The Leiden manifesto for research metrics. Nature, 520(7548), 429–431. https://doi.org/10.1038/520429a
Latour, B., & Woolgar, S. (1979). Laboratory life: The construction of scientific facts. Sage Publications, Beverly Hills.
Manco, A. (2022). A landscape of open science policies research. Sage Open, 12(4), 21582440221140358. https://doi.org/10.1177/21582440221140358
Merton, R. K. (1973). The sociology of science: Theoretical and empirical investigations. University of Chicago Press, Chicago.
Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie Du Sert, N., … Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021
Murray, F., & O’Mahony, S. (2007). Exploring the foundations of cumulative innovation: Implications for organization science. Organization Science, 18(6), 1006–1021.
Nelson, A. J. (2016). How to share "a really good secret": Managing sharing/secrecy tensions around scientific knowledge disclosure. Organization Science, 27(2), 265–285. https://doi.org/10.1287/orsc.2015.1040
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., et al. (2023, September 27). Transparency and openness promotion (TOP) guidelines [OSF]. Retrieved January 18, 2024, from https://osf.io/9f6gx
OPUS project. (2022). EU – Open and Universal Science (OPUS) project. Retrieved from https://opusproject.eu/
RDA-EoR IG. (2023). Research Data Alliance-Evaluation of Research Interest Group. Retrieved from https://www.rd-alliance.org/groups/evaluation-research-ig/
RDA-SHARC IG. (2017). Research Data Alliance - SHAring Rewards & Credit (SHARC) Interest Group. Retrieved from https://www.rd-alliance.org/groups/sharing-rewards-and-credit-sharc-ig
Rijcke, S., Cosentino, C., Crewe, R., D’Ippoliti, C., Motala-Timol, S., Rahman, N. B. A., … Yupeng, Y. (2023). The future of research evaluation: The synthesis of current debates and development. Discussion paper. https://doi.org/10.24948/2023.06
Shibayama, S., & Baba, Y. (2011). Sharing research tools in academia: The case of Japan. Science and Public Policy, 38(8), 649–659. https://doi.org/10.3152/030234211X13122939587699
Shibayama, S., & Lawson, C. (2021). The use of rewards in the sharing of research resources. Research Policy, 50(7), 104260. https://doi.org/10.1016/j.respol.2021.104260
Tenopir, C., Dalton, E. D., Allard, S., Frame, M., Pjesivac, I., Birch, B., … Dorsett, K. (2015). Changes in data sharing and data reuse practices and perceptions among scientists worldwide. PLoS One, 10(8), e0134826. https://doi.org/10.1371/journal.pone.0134826
UHR Working Group. (2021). NOR-CAM – a toolbox for recognition and rewards in academic careers. Retrieved from https://www.uhr.no/en/_f/p3/i86e9ec84-3b3d-48ce-8167-bbae0f507ce8/nor-cam-a-tool-box-for-assessment-and-rewards.pdf
UNESCO. (2021). UNESCO recommendation on open science. Retrieved from https://doi.org/10.54677/MNMH8546
VNSU, NFU, KNAW, NWO, & ZonMw. (2019). Recognition and rewards. Room for everyone’s talent. Retrieved January 18, 2024, from https://recognitionrewards.nl/about/position-paper/
Wallis, J. C., Rolando, E., & Borgman, C. L. (2013). If we share data, will anyone use them? Data sharing and reuse in the long tail of science and technology. PLoS One, 8(7), e67332. https://doi.org/10.1371/journal.pone.0067332
Walsh, J. P., Cohen, W. M., & Cho, C. (2007). Where excludability matters: Material versus intellectual property in academic biomedical research. Research Policy, 36(8), 1184–1203. https://doi.org/10.1016/j.respol.2007.04.006
Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., … Mons, B. (2016). The FAIR guiding principles for scientific data management and stewardship. Scientific Data, 3(1), 160018. https://doi.org/10.1038/sdata.2016.18
Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., … Johnson, B. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. https://doi.org/10.13140/RG.2.1.4929.1363
Working group for responsible evaluation of a researcher. (2020). Good practice in researcher evaluation. Recommendation for the responsible evaluation of a researcher in Finland. The Committee for Public Information (TJNK); Federation of Finnish Learned Societies (TSV). Retrieved from https://avointiede.fi/sites/default/files/2020-03/responsible-evalution.pdf
Wouters, P., Thelwall, M., Kousha, K., Waltman, L., Rijcke, S. de, Rushforth, A., & Franssen, T. (2015). The metric tide: Literature review (Supplementary Report I to the independent review of the role of metrics in research assessment and management). https://doi.org/10.13140/RG.2.1.5066.3520