To research performing organisations

To gain insight and learn how to support OS activities, institutions should first join RRA- and OS-related communities/initiatives (e.g., CoARA, RDA) and encourage their personnel to be active in them. Formal OS policies should be adopted, posted on institutional websites in a discoverable and usable format (e.g., both human- and machine-readable), and communicated to the communities they serve. Central to these policy measures, research outputs should be deposited in community-trusted repositories (e.g., institutionally supported or CoreTrustSeal-certified repositories) and made publicly available and reusable under permissive licences. To make these outputs fully reusable, a data management plan (DMP) should be required for all research projects and the FAIR principles (Findable, Accessible, Interoperable, Reusable) should be applied as far as possible. In particular, all publications (co-)authored by researchers, staff, and students should contain a Data Availability Statement and data citation references, a practice that also applies to other research outputs such as software.
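As an illustration, a Data Availability Statement can be captured as structured metadata alongside the human-readable text, so that completeness can be checked automatically. The following is a minimal sketch in Python; the field names are illustrative rather than a standard schema, and the example DOI is taken from the reference list.

```python
# Minimal sketch: a Data Availability Statement as machine-readable metadata.
# Field names are illustrative, not a standard schema.
from dataclasses import dataclass

@dataclass
class DataAvailabilityStatement:
    text: str          # human-readable statement printed in the publication
    repository: str    # community-trusted repository holding the data
    dataset_doi: str   # persistent identifier for the deposited dataset
    licence: str       # permissive licence identifier (SPDX-style)

    def is_complete(self) -> bool:
        """True if every field needed for discovery and reuse is filled in."""
        return all([self.text, self.repository, self.dataset_doi, self.licence])

statement = DataAvailabilityStatement(
    text="The data supporting this study are openly available.",
    repository="Zenodo",
    dataset_doi="10.5281/zenodo.11123148",  # example DOI from the reference list
    licence="CC-BY-4.0",
)
assert statement.is_complete()
```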

Furthermore, the OS practices expected by a policy should be monitored and rewarded, implying that they should be considered as part of the criteria for recruitment and evaluation. A prerequisite for OS monitoring is engagement with persistent identifier (PID) infrastructures, such as DataCite, which enable tracking of OS activities and outputs through relevant metadata. Even though openly shared datasets, software, protocols, and other research outputs are increasingly accompanied by Digital Object Identifiers (DOIs) and can be tracked, these efforts are not always fully credited in research evaluation and recruitment procedures. There is a need to develop new metrics and indicators for evaluating OS practices that align with the principles of openness, transparency, and collaboration, thereby crediting the creator. Assessing scientific production has traditionally relied on citation-based metrics from databases such as Web of Science or Google Scholar. However, discussions in the research community have moved beyond traditional metrics (derived from PubMed/MEDLINE, Scopus, etc.) and have explored alternative approaches potentially more suited to OS activities (Das 2015; Ugwu Okechukwu et al. 2023; Bosman, Debackere, and Cawthorn 2024).
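As a minimal sketch of what such PID-based monitoring can look like, the following Python snippet queries the public DataCite REST API and tallies an institution's registered outputs by resource type. The endpoint is real, but the query syntax and response fields shown here are assumptions that should be verified against the current DataCite documentation.

```python
# Sketch: tallying an institution's registered outputs by type via the
# DataCite REST API (JSON:API format). Query syntax and response fields
# are assumptions; verify against current DataCite documentation.
from collections import Counter

import requests

API = "https://api.datacite.org/dois"

def count_outputs_by_type(affiliation_query: str) -> Counter:
    """Tally DOIs matching an affiliation query by resourceTypeGeneral."""
    response = requests.get(
        API,
        params={"query": affiliation_query, "page[size]": 100},
        timeout=30,
    )
    response.raise_for_status()
    counts: Counter = Counter()
    for record in response.json().get("data", []):
        types = record.get("attributes", {}).get("types", {})
        counts[types.get("resourceTypeGeneral", "Unknown")] += 1
    return counts

# Hypothetical affiliation query; the field syntax is an assumption.
print(count_outputs_by_type('creators.affiliation.name:"Example University"'))
```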

Capacity building is critical to implementing OS policies. OS capacity building should be improved by incorporating OS education into research workflows (such as curricula, training programs, and working groups), so that it becomes part of the culture. Institutions should facilitate infrastructure and material resources for OS by providing digital services and tools (e.g., FAIR data management services, DMP tools, tools for anonymization, and guidance towards trusted repositories). Notably, OS practice should be facilitated and streamlined by services wherever relevant, such as automated metadata completion via persistent identifiers; in addition, copyright and intellectual property rights should be retained, and any transfer of rights clearly communicated, to comply with OA and OS requirements.
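One concrete building block for such services is DOI content negotiation, supported by the doi.org resolver for Crossref- and DataCite-registered DOIs: requesting a DOI with a citation-metadata media type returns structured metadata that can pre-fill repository or DMP forms. A minimal sketch follows (the example DOI is taken from the reference list):

```python
# Sketch: automated metadata completion via a persistent identifier, using
# DOI content negotiation (the doi.org resolver returns CSL JSON metadata
# for Crossref- and DataCite-registered DOIs).
import requests

def fetch_csl_metadata(doi: str) -> dict:
    """Resolve a DOI to CSL JSON metadata (title, authors, year, ...)."""
    response = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

metadata = fetch_csl_metadata("10.5281/zenodo.11123148")  # DOI from the reference list
print(metadata.get("title"))
```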

Another important aspect is financial support for OS, including PID-related costs (such as DOI registration for all research outputs, including datasets), costs associated with research data/software management, and investments in national/regional OS initiatives such as Diamond OA. To support OS activities, it is important to include related costs in funding applications, create funding opportunities to work with relevant OS communities, and establish other incentives for OS activities. Various types of OS rewarding solutions need to be explored and implemented, ranging from awards, salary bonuses, champion roles, and badging schemes to additional free time (e.g., sabbaticals), depending on context. These should also be integrated and recognized as part of recruitment, promotion, and tenure schemes (e.g., recognizing open access to research outputs). Token recognition systems (e.g., blockchain-backed) are also emerging as a new opportunity to reward the contributions that academics make to the scientific ecosystem: academics would be awarded token bounties for undertaking common but vital tasks such as peer review, committee work, and writing reports (Finke and Hensel 2024). These tokens would serve as a validated record of scientific contribution that could be used in research evaluation schemes, complementing existing citation mechanisms that extend recognition to data, software, and other research outputs.
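To make this idea concrete, the sketch below shows one way such a validated record could be structured: a hash-chained log of contribution tokens, in which each entry commits to the hash of its predecessor so that earlier records cannot be silently altered. This is a generic illustration of the mechanism, not the scheme proposed by Finke and Hensel (2024); all identifiers and fields are hypothetical.

```python
# Illustrative sketch of a tamper-evident contribution ledger: each token
# entry stores the hash of the previous entry, so altering any record
# invalidates all later ones. All names and fields are hypothetical.
import hashlib
import json

def entry_hash(entry: dict) -> str:
    """Deterministic SHA-256 hash of a ledger entry."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_token(ledger: list, contributor: str, task: str, amount: int) -> None:
    """Append a contribution token, chained to the previous entry's hash."""
    prev = entry_hash(ledger[-1]) if ledger else "genesis"
    ledger.append({"contributor": contributor, "task": task,
                   "amount": amount, "prev_hash": prev})

def verify(ledger: list) -> bool:
    """Check that no earlier entry has been altered after the fact."""
    return all(ledger[i]["prev_hash"] == entry_hash(ledger[i - 1])
               for i in range(1, len(ledger)))

ledger: list = []
append_token(ledger, "orcid:0000-0002-XXXX", "peer review", 5)  # hypothetical ID
append_token(ledger, "orcid:0000-0002-XXXX", "committee work", 2)
assert verify(ledger)
```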


Table 1: Recommendations to research performing organisations

References

Bosman, Jeroen, Koenraad Debackere, and William Cawthorn. 2024. “Next Generation Metrics for Scientific and Scholarly Research in Europe.” https://doi.org/10.5281/zenodo.11123148.
Das, Anup Kumar. 2015. Research Evaluation Metrics. UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000232210.
Finke, Andreas, and Thomas Hensel. 2024. “Decentralized Peer Review in Open Science: A Mechanism Proposal.” https://doi.org/10.48550/arXiv.2404.18148.
Ugwu Okechukwu, Paul-Chima, Nnenna Ugwu Jovita, Ugo Alum Esther, and Emmanuel I. Obeagu. 2023. “Redefining Academic Performance Metrics: Evaluating the Excellence of Researchers, Academics, and Scholars” 4 (1): 36–42. https://doi.org/10.59298/NIJSES/2023/10.5.1000.