Journal article
Reproducibility in Research: Systems, Infrastructure, Culture
Journal of Open Research Software, Volume 5
Author (Swansea University): Tom Crick
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Published in: Journal of Open Research Software
The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge that is missing from the published manuscripts and transient project websites. Alongside this issue, the lack of open, transparent and fair benchmark sets presents another barrier to the verification and validation of claimed results.

In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and to drive the adoption of new techniques that improve the quality and efficiency – and thus the reproducibility – of scientific exploration.
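As a concrete illustration of the dependency problem the abstract describes, the sketch below captures a machine-readable manifest of the interpreter, operating system, and installed package versions so that a computational result can travel with an exact record of the environment that produced it. This is not the paper's platform; it is a minimal, hedged example of the kind of dependency capture such an automated platform might perform, and the manifest filename and structure are illustrative assumptions.

```python
import json
import platform
import sys
from importlib import metadata


def capture_environment() -> dict:
    """Record interpreter version, OS, and installed package versions
    so a result can later be re-run under the same dependencies."""
    return {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "packages": {
            dist.metadata["Name"]: dist.version
            for dist in metadata.distributions()
            if dist.metadata["Name"]  # skip broken/nameless distributions
        },
    }


if __name__ == "__main__":
    manifest = capture_environment()
    # Write a manifest intended to be archived alongside code and results
    # (the filename "environment.json" is an illustrative choice).
    with open("environment.json", "w") as fh:
        json.dump(manifest, fh, indent=2, sort_keys=True)
    print(f"captured {len(manifest['packages'])} packages")
```

Sharing such a manifest with a publication lets another researcher diff their own environment against the original one before attempting to reproduce a result, which is one small step toward the abstraction from individual workstations that the abstract advocates.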
Keywords: reproducible research, cyberinfrastructure, scientific workflows, computational science, open science, data sharing, code sharing, best practices
College of Science