CURE’s mission is to support the curation of research data and the review of code and associated digital scholarly objects in order to facilitate the digital preservation of the evidence base necessary for future understanding, evaluation, and reproducibility of scientific claims. We do this by establishing standards, sharing practices, and promoting the philosophy of Data Quality Review.
Building on existing and evolving standards, CURE is dedicated to establishing and communicating the driving principles and criteria for proper curation for reproducibility.
Curating for reproducibility involves multiple tasks and several stakeholders. A primary goal of CURE is to map the vital elements of this workflow and to share the best practices that have emerged within each member organization.
Promoting Data Quality Review
CURE members believe that pre-publication data quality review is essential for the progression of science and preservation of knowledge.
1. TRANSPARENCY, ACCESS, AND TRUST.
All research objects underlying published or reported findings should be made available to the scientific community (subject to applicable legal, regulatory, contractual, and ethical obligations) and deposited with a trusted repository.
All research objects underlying published or reported findings must be usable and independently understandable for the long term. That is, data should be well documented, and code should execute properly and interact with input data as reported by the researcher.
Steps should be taken to ensure that published or reported computational findings and analyses can be reproduced on an independent computational system and by independent third parties.
To the extent possible, independent reproduction of computational findings and analyses should take place prior to publication.
*These principles acknowledge and build upon the following extant work in this area:
- Data Citation Synthesis Group. (2014). Joint Declaration of Data Citation Principles. (M. Martone, Ed.). FORCE11. Retrieved from https://www.force11.org/group/joint-declaration-data-citation-principles-final
- King, G. (1995). Replication, replication. PS: Political Science & Politics, 28(3), 444–452. https://doi.org/10.2307/420301
- Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., … Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021
- Peer, L., Green, A., & Stephenson, E. (2014). Committing to data quality review. International Journal of Digital Curation, 9(1). https://doi.org/10.2218/ijdc.v9i1.317
- Smith, A. M., Katz, D. S., Niemeyer, K. E., & FORCE11 Software Citation Working Group. (2016). Software citation principles. PeerJ Computer Science, 2, e86. https://doi.org/10.7717/peerj-cs.86
- Stodden, V. (2009). Enabling reproducible research: Open licensing for scientific innovation. International Journal of Communications Law and Policy, (13), 22–47. Retrieved from https://www.stanford.edu/~vcs/papers/ERROLSI03092009.pdf
- Stodden, V., McNutt, M., Bailey, D. H., Deelman, E., Gil, Y., Hanson, B., … Taufer, M. (2016). Enhancing reproducibility for computational methods. Science, 354(6317), 1240–1241. https://doi.org/10.1126/science.aah6168
- Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., … Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3, 160018. https://doi.org/10.1038/sdata.2016.18