Journal article

Reproducibility in Research: Systems, Infrastructure, Culture

Tom Crick, Benjamin A. Hall, Samin Ishtiaq

Journal of Open Research Software, Volume: 5

Swansea University Author: Tom Crick

  • 73-2363-1-PB.pdf

    PDF | Version of Record

    This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0).

Download (631.24 KB)

DOI (Published version): 10.5334/jors.73

Abstract

The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking, and the lack of open, transparent and fair benchmark sets, presents another barrier to the verification and validation of claimed results.

In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

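The abstract describes the proposed platform only at a high level. As a purely illustrative sketch (not the authors' implementation), the hypothetical Python script below shows one way the dependency-abstraction idea could look in practice: it records the interpreter version, operating system, pinned package list and checksums of the input data in a machine-readable manifest that can be shared alongside a result.

    # Hypothetical illustration only -- not the authors' platform. A minimal
    # sketch of recording the software environment and input checksums
    # alongside a computational result, so that a rerun elsewhere can be
    # checked against the original run.
    import hashlib
    import json
    import platform
    import subprocess
    import sys

    def sha256(path):
        # Checksum an input file so a rerun can confirm identical data.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def write_manifest(input_paths, out_path="manifest.json"):
        # Capture interpreter, OS and pinned packages plus input hashes.
        manifest = {
            "python": sys.version,
            "platform": platform.platform(),
            # Pinned package list; assumes pip is available in the environment.
            "packages": subprocess.run(
                [sys.executable, "-m", "pip", "freeze"],
                capture_output=True, text=True, check=True,
            ).stdout.splitlines(),
            "inputs": {p: sha256(p) for p in input_paths},
        }
        with open(out_path, "w") as f:
            json.dump(manifest, f, indent=2)

    if __name__ == "__main__":
        write_manifest(sys.argv[1:])

A collaborator can regenerate the manifest after a rerun and diff it against the published copy; any mismatch in package versions or input checksums flags a likely source of irreproducibility. The names write_manifest and manifest.json are invented for this sketch.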

Published in: Journal of Open Research Software
ISSN: 2049-9647
Published: Ubiquity Press, 9 November 2017
Online Access: https://openresearchsoftware.metajnl.com/articles/10.5334/jors.73/
Keywords: reproducible research, cyberinfrastructure, scientific workflows, computational science, open science, data sharing, code sharing, best practices

URI: https://cronfa.swan.ac.uk/Record/cronfa43573