10 JANUARY 2017 | Marcus R. Munafò, Brian A. Nosek, Dorothy V. M. Bishop, Katherine S. Button, Christopher D. Chambers, Nathalie Percie du Sert, Uri Simonsohn, Eric-Jan Wagenmakers, Jennifer J. Ware, John P. A. Ioannidis
The article "A manifesto for reproducible science" by Munafò et al. addresses the need to improve the reliability and efficiency of scientific research. The authors argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation, and incentives. They highlight the prevalence of problems such as low statistical power (driven by small sample sizes and small effect sizes), data dredging, conflicts of interest, and competitive siloing among scientists, all of which raise the probability that published findings are incorrect.

The article proposes a series of practical, evidence-based actions to enhance research efficiency and robustness: protecting against cognitive biases through blinding, improving methodological training, providing independent methodological support, encouraging collaboration and team science, promoting study pre-registration, improving reporting quality, and diversifying peer review. The authors emphasize the importance of transparency and open science, advocating the sharing of data, methods, and results to increase accountability, longevity, efficiency, and quality. They also discuss how incentives shape research practices and suggest that shifting incentives can encourage more rigorous and reproducible research. The article concludes by stressing the ongoing nature of the scientific process and the need for continuous self-examination and improvement to foster a robust metascience that evaluates and improves research practices.
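The connection between low statistical power and unreliable published findings can be made concrete through the positive predictive value (PPV) of a claimed discovery, a calculation associated with Ioannidis's earlier work that the manifesto builds on. A minimal sketch follows; the specific parameter values (pre-study odds of 10%, alpha of 0.05) are illustrative assumptions, not figures taken from the article:

```python
def positive_predictive_value(power: float, alpha: float, prior: float) -> float:
    """Probability that a statistically significant result reflects a true effect.

    power: probability of detecting a real effect (1 - beta)
    alpha: false-positive rate of the significance test
    prior: pre-study probability that a tested hypothesis is true
    """
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

# Illustrative scenario: 10% of tested hypotheses are true, alpha = 0.05.
well_powered = positive_predictive_value(power=0.8, alpha=0.05, prior=0.1)
under_powered = positive_predictive_value(power=0.2, alpha=0.05, prior=0.1)
print(f"PPV at 80% power: {well_powered:.2f}")   # 0.64
print(f"PPV at 20% power: {under_powered:.2f}")  # 0.31
```

Under these assumptions, dropping power from 80% to 20% roughly halves the chance that a "significant" finding is real, which is the sense in which low power contributes to a high probability of incorrect findings.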