One of the pathways by which the scientific community confirms the validity of a new scientific discovery is by repeating the research that produced it. When a scientific effort fails to independently confirm the computations or results of a previous study, some fear that it may be a symptom of a lack of rigor in science, while others argue that such an observed inconsistency can be an important precursor to new discovery.
Concerns about reproducibility and replicability have been expressed in both scientific and popular media. As these concerns came to light, Congress requested that the National Academies of Sciences, Engineering, and Medicine conduct a study to assess the extent of issues related to reproducibility and replicability and to offer recommendations for improving rigor and transparency in scientific research.
Reproducibility and Replicability in Science defines reproducibility and replicability and examines the factors that may lead to non-reproducibility and non-replicability in research. Unlike the typical expectation of reproducibility between two computations, expectations about replicability are more nuanced, and in some cases a lack of replicability can aid the process of scientific discovery. This report provides recommendations to researchers, academic institutions, journals, and funders on steps they can take to improve reproducibility and replicability in science.
| Table of Contents | Pages |
|---|---|
| Front Matter | i-xxii |
| Executive Summary | 1-4 |
| Summary | 5-20 |
| 1 Introduction | 21-26 |
| 2 Scientific Methods and Knowledge | 27-38 |
| 3 Understanding Reproducibility and Replicability | 39-54 |
| 4 Reproducibility | 55-70 |
| 5 Replicability | 71-104 |
| 6 Improving Reproducibility and Replicability | 105-142 |
| 7 Confidence in Science | 143-162 |
| References | 163-188 |
| Appendix A: Biographical Sketches of Committee Members and Staff | 189-198 |
| Appendix B: Agendas of Open Committee Meetings | 199-208 |
| Appendix C: Recommendations Grouped by Stakeholder | 209-220 |
| Appendix D: Using Bayes Analysis for Hypothesis Testing | 221-230 |
| Appendix E: Conducting Replicable Surveys of Scientific Communities | 231-234 |
The National Academies Press and the Transportation Research Board have partnered with the Copyright Clearance Center to offer a variety of options for reusing our content. For most academic and educational uses, no royalties are charged, although you are required to obtain a license and comply with its terms and conditions.

Permission to reuse Reproducibility and Replicability in Science, as well as translation rights and any other rights-related queries, can be requested through the Copyright Clearance Center.
For questions about using the Copyright.com service, please contact:
Copyright Clearance Center
22 Rosewood Drive
Danvers, MA 01923
Tel (toll free): 855/239-3415 (select option 1)
E-mail: [email protected]
Web: https://www.copyright.com