ReScience C
ReScience C is a journal created in 2015 by Nicolas Rougier and Konrad Hinsen with the aim of publishing researchers' attempts to replicate computations made by other authors, using independently written, free and open-source software (FOSS), through an open peer-review process.[1] The journal states that requiring the replication software to be free and open-source ensures the reproducibility of the original research.[3]
| | |
|---|---|
| Discipline | Reproducibility |
| Language | English |
| Edited by | Olivia Guest, Benoît Girard, Konrad Hinsen, Nicolas Rougier[1] |
| **Publication details** | |
| History | 2015–present[1] |
| Open access | diamond/platinum |
| License | CC BY 4.0[2] |
| **Standard abbreviations** | |
| ISO 4 | ReSci. C |
| **Indexing** | |
| ISSN | 2430-3658 |
Creation
ReScience C was created in 2015 by Nicolas Rougier and Konrad Hinsen in the context of the replication crisis of the early 2010s, during which the difficulty of replicating (with different data or details of method) or reproducing (with the same data and method) peer-reviewed, published research was widely discussed.[4] ReScience C's scope is computational research, motivated by the observation that journals rarely require the provision of source code, and that when source code is provided, it is rarely checked against the results claimed in the research article.[5]
Policies and methods
The scope of ReScience C is mainly focused on researchers' attempts to replicate computations made by other authors, using independently written, free and open-source software (FOSS).[1] Articles are submitted using the "issues" feature of a Git repository hosted on GitHub, and are archived with other online services, including Zenodo and Software Heritage. Peer review takes place publicly in the same "issues" online format.[2]
In 2020, Nature reported on the results of ReScience C's "Ten Years' Reproducibility Challenge", in which scientists were asked to try reproducing the results from peer-reviewed articles that they had published at least ten years earlier, using the same data and software if possible, updated to a modern software environment and free licensing.[1] As of 24 August 2020, 35 researchers had proposed to reproduce the results of 43 of their old articles; 28 reports had been written, and 13 had been accepted after peer review and published, of which 11 documented successful reproductions.[1]
References
- Perkel, Jeffrey M. (2020-08-24). "Challenge to scientists: does your ten-year-old code still run?". Nature. 584 (7822): 656–658. doi:10.1038/d41586-020-02462-7. Archived from the original on 2020-08-24. Retrieved 2020-08-31.
- "Reproducible Science is good. Replicated Science is better". GitHub. 2020. Archived from the original on 2020-08-31. Retrieved 2020-08-31.
- Pashler, Harold; Wagenmakers, Eric Jan (2012). "Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence?". Perspectives on Psychological Science. 7 (6): 528–530. doi:10.1177/1745691612465253. PMID 26168108. S2CID 26361121.
- Rougier, Nicolas P.; Hinsen, Konrad (2017-12-18). "Sustainable computational science: the ReScience initiative". PeerJ Computer Science. 3: e142. arXiv:1707.04393. Bibcode:2017arXiv170704393R. doi:10.7717/peerj-cs.142. ISSN 2376-5992. S2CID 7392801. Archived from the original on 2020-08-31. Retrieved 2020-08-31.