New Report Recommends Greater Transparency in Research

Associate Professor Lorena Barba recently presented findings from the congressionally mandated report on Capitol Hill.

May 9, 2019

By Kristen Mitchell

Lorena Barba, an associate professor of mechanical and aerospace engineering at the School of Engineering and Applied Science, recently presented a congressionally mandated report to federal lawmakers and experts that will guide future national policy on science and engineering research.

Dr. Barba is a member of the National Academies of Sciences, Engineering, and Medicine (NASEM) Reproducibility and Replicability in Science study committee, which was formed in 2017. Leading up to the release of its report on May 7, the committee held a series of briefings with the White House Office of Science and Technology Policy, the National Science Foundation (NSF) and other stakeholders.

The committee aimed to identify issues of replication and reproducibility in scientific and engineering research. Its report makes recommendations for improving rigor and transparency and highlights good practices.

GW Today recently spoke with Dr. Barba about the report:

Q: Why do you think it was important to have a committee study issues surrounding research replication and reproducibility?
A: The study was commissioned by the National Science Foundation in response to a congressional mandate. Public Law 114-329 cites “growing concern that some published research findings cannot be reproduced or replicated…” and directs the NSF to produce a report with an assessment and recommendations on the matter. I don’t know what prompted this legal directive, but I do know that for the past few years, various media reports have publicized prominent failures to replicate findings. At the same time, a movement for open science and reproducibility has been growing across fields of research. It was thus timely to do an in-depth study looking across all of science.

Q: How did you get involved with this committee?
A: I was nominated by anonymous members of the scientific community as a recognized authority on reproducibility. The National Academies contacted me about the study committee in September 2017, and after a phone interview and sending additional background about my work, I was formally invited in November 2017. We’ve worked on this study for nearly a year and a half.

Q: What are the major findings from the report?
A: The overall finding can be summarized in four words: no crisis, no complacency. The message is twofold. On one hand, the crisis narrative that has unfolded over the past few years is mostly rhetoric; reproducibility and replicability are part of the ways in which science self-corrects, but they are not the sole concern. On the other hand, improvements are needed: more transparency of computational workflows, code and data, for example, and adjusting the incentive structure to value reproducible research. Other findings include the need for greater fluency with statistics and for training early-career researchers in computational tools and methods.

Q: Why did the committee seek to define what the terms replication and reproducibility mean?
A: The lack of a standard usage for these terms hinders progress, as often it’s unclear what researchers mean and thus how to address the concerns. Both reproducibility and replicability are used in relation to the general concern of confirming the findings of a previously published study. Sometimes the words are used as an umbrella term for all related concerns. But some fields have made a distinction between the two terms, and it’s important to agree on what that distinction is to move the conversation forward. The study defines reproducibility as obtaining consistent computational results using the same input data, computational steps, methods, code and conditions of analysis. A replication study, on the other hand, collects new data and conducts new analysis in pursuit of confirming the findings of a previous study.
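To make the reproducibility half of that distinction concrete, here is a minimal sketch in Python. It is my own illustration, not an example from the report: the input file name, the toy bootstrap analysis and the fixed random seed are all hypothetical, chosen only to show that running the same data through the same code under the same conditions of analysis yields identical results.

```python
# Minimal sketch of "computational reproducibility": same input data, same code,
# same conditions of analysis -> the same results, every run.
# The file name and the toy analysis below are hypothetical, for illustration only.
import hashlib
import numpy as np

def run_analysis(input_file: str, seed: int = 42) -> np.ndarray:
    """Toy analysis: bootstrap the mean of a single-column data file."""
    data = np.loadtxt(input_file)          # same input data
    rng = np.random.default_rng(seed)      # same conditions of analysis (fixed seed)
    samples = rng.choice(data, size=(1000, data.size), replace=True)
    return samples.mean(axis=1)            # same computational steps

def fingerprint(results: np.ndarray) -> str:
    """Hash the results so two runs can be compared byte for byte."""
    return hashlib.sha256(results.tobytes()).hexdigest()

if __name__ == "__main__":
    first = fingerprint(run_analysis("measurements.txt"))
    second = fingerprint(run_analysis("measurements.txt"))
    # Reproducible: re-running the identical workflow gives identical results.
    assert first == second
```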

Q: What should students interested in doing research in the future take away from this report?
A: The report emphasizes the ubiquitous and important role of computing in modern science, and the need for enhanced training in statistical methods. Students interested in research would gain an advantage by seeking training in computational skills—including open-source tools, software development, data management and how to automate repetitive tasks. Conducting research reproducibly becomes less challenging if you use the right tools.
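As one illustration of automating repetitive tasks, consider a single driver script that re-runs an entire analysis with one command, so no result depends on manual, undocumented steps. This is a sketch of my own, not a tool recommended in the report, and the script names and folder layout are hypothetical.

```python
# Minimal sketch of a driver script that re-runs a whole (hypothetical) analysis
# pipeline in one command; the step scripts and output folders are placeholders.
from pathlib import Path
import subprocess

STEPS = [
    ["python", "clean_data.py"],    # hypothetical: raw files -> data/clean.csv
    ["python", "run_analysis.py"],  # hypothetical: clean data -> results/
    ["python", "make_figures.py"],  # hypothetical: results -> figures/
]

def main() -> None:
    """Run every step in order, stopping loudly if any step fails."""
    for step in STEPS:
        print("running:", " ".join(step))
        subprocess.run(step, check=True)  # check=True raises if the step errors
    print("done; outputs in", Path("results").resolve())

if __name__ == "__main__":
    main()
```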

Q: Now that the report is released, what kind of further work do you think needs to be done related to these issues?
A: The report makes a series of recommendations that will involve continued work by various people and institutions. Education and training in both statistics and computational methods need reinforcement. Journals and professional societies need to work on initiatives to promote computational reproducibility and the publishing of replications. The report also recommends that funding agencies incorporate reproducibility and replicability into their merit-review criteria for new proposals.