Science paper emphasizes need to share and cite data, code and workflows across fields of research
More than 700 years ago, English philosopher Roger Bacon made the first known effort to define the scientific method when he described an iterative cycle of observation, hypothesis, experimentation, and — perhaps most important — the need for independent verification.
Bacon’s approach has held up over centuries of science, but he couldn’t have foreseen the introduction of a new wrinkle in the fabric of research — the use of computational methods in virtually all fields of scientific scholarship, from weather prediction and earthquake modeling to energy efficiency and genomics.
“The use of computers to process and analyze data and to simulate complex systems has led to major concerns about the reproducibility, replicability, and robustness of computer-enabled results,” says the University of Delaware’s Michela Taufer. “Transparency in the scientific literature about the computational methods used in research is essential to successfully work with the communities using the results.”
A computational experiment can meet various levels of reproducibility, replicability, and robustness, ranging from a rigorous description of the artifacts by the original research group to the ability of another group to obtain the same results using independently developed artifacts. Artifacts are all of the elements in a software development project, including documentation, test plans, images, data files and executable modules.
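To make the idea concrete, here is a minimal sketch in Python of the kind of provenance record an artifact package might include alongside code and data; the file names, fields, and helper functions are illustrative assumptions, not elements prescribed by the paper.

```python
# Illustrative sketch (not from the paper): record basic provenance for a
# computational result so that others can audit or rerun the experiment.
import hashlib
import json
import platform
import sys
from datetime import datetime, timezone

def sha256_of(path):
    """Return the SHA-256 checksum of a data file, so readers can verify
    they are using exactly the same input."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(data_files, seed, out_path="artifact_manifest.json"):
    """Write a small JSON manifest describing the computational environment,
    input data, and random seed used to produce a result."""
    manifest = {
        "created": datetime.now(timezone.utc).isoformat(),
        "python_version": sys.version,
        "platform": platform.platform(),
        "random_seed": seed,
        "data_files": {p: sha256_of(p) for p in data_files},
    }
    with open(out_path, "w") as f:
        json.dump(manifest, f, indent=2)
    return manifest

# Example usage (hypothetical input file):
# write_manifest(["observations.csv"], seed=42)
```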
Taufer is a co-author of a paper addressing this issue, “Enhancing Reproducibility for Computational Methods,” published online in the journal Science.
The recommendations shared in the paper grew out of an AAAS workshop that included funding agencies, publishers and journal editors, industry participants, and researchers representing a broad range of domains.
The group concluded that access to the computational steps taken to process data and generate findings is as important as access to the data itself.
In addition to defining a detailed reproducibility check, the authors suggest that journals can improve review of computational findings by rewarding reviewers who take the time to verify these findings.
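As a purely illustrative example of what sharing the computational steps can look like in practice, the short Python sketch below keeps every step from raw input to reported summary in a single rerunnable script; the file name and analysis are hypothetical stand-ins, not workflows from the paper.

```python
# Illustrative sketch (hypothetical pipeline): every step from raw data to the
# reported statistics lives in one rerunnable script rather than in untracked,
# manual steps.
import csv
import statistics

def load_measurements(path):
    """Read a one-column CSV of numeric measurements (hypothetical format)."""
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def summarize(values):
    """Compute the summary statistics that would be reported in a paper."""
    return {
        "n": len(values),
        "mean": statistics.mean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
    }

if __name__ == "__main__":
    values = load_measurements("measurements.csv")  # hypothetical input file
    print(summarize(values))
```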
Taufer is actively promoting transparency of computer-enabled results in the high-performance computing community. She is part of the advisory team that has been leading the charge for the Association for Computing Machinery (ACM) to introduce a “Good Housekeeping Seal of Approval” type of badging system for computational excellence in its Digital Library.
Under this system, a red badge would be applied to papers whose associated artifacts have successfully completed an independent audit, a green one to papers whose author-created artifacts have been placed in a publicly accessible archival repository, and a gold one to papers in which the main results have been successfully obtained by researchers other than the authors.
The reproducibility initiative was put to the test in the Student Cluster Competition (SCC) at the recent ACM/IEEE International Conference for High Performance Computing, Networking, Storage and Analysis. SCC challenges teams of undergraduate students to partner with a hardware vendor to build and test a high-end cluster at the conference.
In the past, the only criterion in the competition was computational speed, but a reproducibility challenge was added this year, with the students receiving a scientific paper and the associated code before the conference. During the competition, they had to replicate the work for a different dataset, analyze the impact of a different architecture, assess the impact of downscaling, write a report organized according to a provided structure, and discuss the experience in an exit interview.
“While our Science paper focuses on computer science concepts like the sharing of data, code, and workflow and the licensing of digital assets, we believe that its impact goes far beyond our field,” Taufer says. “The size and scale of modern high-end clusters and supercomputers have introduced additional complexity to what was already an important issue in how we conduct research and disseminate new knowledge.”
About the paper
The paper emerged from the Reproducibility Session at the AAAS Forum on Science and Technology Policy, held in Washington, D.C., on May 1-2, 2014.
The paper was co-authored by Victoria Stodden (University of Illinois, Urbana-Champaign), Marcia McNutt (National Academy of Sciences), David H. Bailey (University of California, Davis), Ewa Deelman (University of Southern California), Yolanda Gil (University of Southern California), Brooks Hanson (American Geophysical Union), Michael Heroux (Sandia National Laboratories), John P.A. Ioannidis (Stanford University), and Michela Taufer (University of Delaware).
Taufer is an associate professor in the Department of Computer and Information Sciences.