Reproducibility is at the core of solid scientific and technical research. Repeating published research is a key means of confirming the validity of a new scientific discovery. The TPDS Reproducibility Initiative currently promotes reproducible research through transparency and through the availability and potential reuse of code and data. TPDS has partnered with Code Ocean, a cloud-based computational reproducibility platform, to pilot post-publication peer review of code associated with articles published in TPDS. Authors who have published in TPDS can make their article more reproducible, and earn a reproducibility badge, by submitting the associated code for post-publication peer review. TPDS is the first IEEE transactions journal to pilot reproducibility badging.
TPDS offers two badges:
- Code Available: The code, including any associated data and documentation, provided by the authors is reasonable and complete and can potentially be used to support reproducibility of the published results.
- Code Reviewed: The code, including any associated data and documentation, provided by the authors is reasonable and complete, runs to produce the outputs described, and can support reproducibility of the published results.
Call for Supplemental Papers Evaluating Reproducibility
The next step in the TPDS reproducibility initiative invites authors to submit supplemental papers that present their experiences replicating published results using the associated artifacts, or their evaluations of published artifacts. These supplemental paper submissions will be peer reviewed and, if accepted, will be linked to the original publication and citable.
List of Papers with Badges
R. Tolosana-Calasanz, J. Diaz-Montes, O. F. Rana, and M. Parashar, "Feedback-Control & Queueing Theory-Based Resource Management for Streaming Applications," IEEE Transactions on Parallel and Distributed Systems, vol. 28, no. 4, pp. 1061-1075, April 2017. DOI: 10.1109/TPDS.2016.2603510