Authors Filippo Vasone and Mario Marino are Data Stewards at the Alma Mater Research Institute for Human-Centered Artificial Intelligence (Alma AI), University of Bologna, Italy: https://centri.unibo.it/alma-ai/en.
At the beginning of the seventeenth century, when Johannes Kepler was working under the Danish astronomer Tycho Brahe in Bohemia, he recognized the great value of the decades of astronomical observations Brahe had collected. Brahe, however, was extremely protective of his data and never gave Kepler full access to them during his lifetime (Resnik, 2023). While this episode may now make us smile, the tension between secrecy, or closure, and openness in science is still very much alive. Most recently, discourses around science policy have been focusing on the concept of research security.
There is no single definition of research security: it is a broad topic, difficult to standardize, and it carries different meanings depending on the field, so national and international institutions define it in different ways. According to the European Union, for example, research security deals with risks of knowledge transfer that affect the security of the Union and its Member States, or that, when used unethically, suppress the fundamental rights and values upheld by the Union (European Council, 2024, art. 18.1). The G7 (SIGRE Working Group, 2024, p. 3) states that research security aims at protecting research communities from actors and behaviors posing economic, national, and international security risks.
Open Science (OS) advocates like us may fear that the discourse on research security clashes with fundamental OS values and principles, such as participation, collective benefit, inclusiveness, and diversity (UNESCO, 2021). The institutional and political push towards both OS and research security can look like an unhappy marriage, reviving the essential tension between secrecy and openness in scientific research.
Yet we hold that it is possible, and much needed, to navigate this fraught territory and integrate research security within OS. By enriching our understanding of openness, we hope to ease the tension between research security and OS and ultimately advance the values that originally guided the OS movement. In line with other scholars working on OS (such as Leonelli, 2023), we believe that OS is first and foremost a responsible science, one that understands and responds to the social context in which it is embedded.
The responsibility of the scientist is thus central to Open Science. In our work as data stewards at the University of Bologna, one of our main missions is to promote researchers’ accountability in research data management (RDM).
Both OS and RDM contribute to making science more transparent, efficient, reproducible, responsible, and trustworthy. Open Research Data (ORD) are an essential part of OS practice, as are data that, in addition to being managed through good RDM practices, meet four macro-principles of technical quality: findability, accessibility, interoperability, and reusability (FAIR data). Correctly managed FAIR data are not necessarily ORD, nor can it be assumed that ORD have been properly managed. We believe that good science should aim at the intersection between FAIR data and ORD. This goal can be achieved by increasing researchers’ accountability, a process that needs professional support.
To increase researchers’ accountability, our approach hinges on enhancing researchers’ self-assessment capabilities in an integrated way. In recent years, we have developed a decision tree that provides researchers with an essential set of questions for responsible RDM (Caldoni, Gualandi, & Marino, 2022). The questions focus not only on the FAIR principles, but also on privacy and intellectual property rights, helping researchers find the intersection between OS and RDM. The tree has been warmly welcomed by the researchers we support with RDM, for example in projects funded by Horizon Europe.
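For readers who prefer to see the idea in concrete terms, here is a minimal sketch of how a question-based decision tree for RDM self-assessment can be represented. The questions, branches, and advice below are hypothetical illustrations of ours; they do not reproduce the actual content of the published tool.

```python
# Minimal, illustrative sketch of a question-based RDM decision tree.
# The questions and advice are hypothetical examples, not the actual
# content of the Caldoni, Gualandi & Marino (2022) tree.

RDM_TREE = {
    "question": "Do your data contain personal information?",
    "yes": {
        "question": "Do you have a legal basis and informed consent in place?",
        "yes": {"advice": "Document the legal basis in your Data Management Plan."},
        "no": {"advice": "Contact your data protection officer before collecting data."},
    },
    "no": {
        "question": "Are third-party intellectual property rights involved?",
        "yes": {"advice": "Check licensing terms before sharing or reusing the data."},
        "no": {"advice": "Plan to deposit the data in a FAIR-aligned repository."},
    },
}


def walk(node: dict) -> str:
    """Traverse the tree interactively until a piece of advice is reached."""
    while "advice" not in node:
        answer = input(f"{node['question']} [yes/no] ").strip().lower()
        node = node["yes"] if answer.startswith("y") else node["no"]
    return node["advice"]


if __name__ == "__main__":
    print(walk(RDM_TREE))
```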
In a recent update to the RDM decision tree, presented at the Open Science Fair 2025 at CERN, we integrated security-related questions. We operationalized the abstract definitions of research security by breaking its meaning down into three main dimensions: the safety of the people involved in research (participants and researchers), dual-use risks, and cybersecurity.
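To give a flavour of what this looks like in practice, the sketch below pairs each of the three dimensions with examples of the kind of self-assessment questions a researcher might be asked. The wording is ours, for illustration only, and does not reproduce the updated tree.

```python
# Hypothetical examples of self-assessment questions for the three security
# dimensions named above; illustrative wording, not the tool's actual text.

SECURITY_QUESTIONS = {
    "safety_of_people": [
        "Could participants or researchers be harmed if the data were disclosed?",
        "Is fieldwork carried out in contexts that are dangerous for the team?",
    ],
    "dual_use": [
        "Could the methods, data, or results be misused for harmful purposes?",
        "Do export-control rules apply to the technologies involved?",
    ],
    "cybersecurity": [
        "Are the data stored on access-controlled, institutionally managed infrastructure?",
        "Are sensitive data encrypted in transit and at rest?",
    ],
}
```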
We then articulated the security-related questions by drawing on national and international (both European and global) guidelines, as well as the scientific literature on research security. We also tested the questions against two challenging case studies that elicit security concerns: first, dual-use risks in research on algorithms for drone navigation; second, anticorruption research in dangerous contexts.
In this way, the decision tree has become a single, integrated instrument that helps researchers lay the foundations of responsible RDM. Similarly, we envision data stewards as single points of contact within research services, strengthening accountability in research practices through responsible RDM. This integrated approach enables OS and research security not only to coexist but to complement each other, under the guidance of a scientific community whose members are able to take full responsibility for their role in society.
The poster contribution to the Open Science Fair 2025 is available at https://doi.org/10.5281/zenodo.17193163, and the demo dashboard displaying the decision tree at https://doi.org/10.5281/zenodo.17193536.
We are interested in hearing what other researchers, institutions, and research services professionals think about Open Science and research security. If you would like to share your perspective or give feedback on our tool, we are happy to engage with you! You can contact us at aric.datasteward@unibo.it.
References
Caldoni, G., Gualandi, B., & Marino, M. (2022). Research Data Management Decision Tree. Zenodo. https://doi.org/10.5281/zenodo.7190004
European Council. (2024). Council Recommendation of 23 May 2024 on enhancing research security. (C/2024/3510). https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ:C_202403510
Leonelli, S. (2023). Philosophy of open science. Cambridge University Press. https://doi.org/10.1017/9781009416368
Resnik, D. (2023). Openness in Scientific Research: A Historical and Philosophical Perspective. Journal of Open Access to Law, 11(1), 132.
SIGRE (Security and Integrity of the Global Research Ecosystem) Working Group. (2024). G7 Best Practices for Secure and Open Research. G7. https://science.gc.ca/site/science/sites/default/files/documents/1136-g7-best-practices-for-secure-and-open-research-october-2024.pdf
UNESCO. (2021). UNESCO Recommendation on Open Science. UNESCO. https://doi.org/10.54677/MNMH8546
