
Journal article

Reproducibility in Research: Systems, Infrastructure, Culture / Tom Crick; Benjamin A. Hall; Samin Ishtiaq

Journal of Open Research Software, Volume: 5

Swansea University Author: Crick, Tom (ORCID: 0000-0001-5196-9389)

  • 73-2363-1-PB.pdf

    PDF | Version of Record

    This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0).

    Download (631.24KB)


DOI (Published version): 10.5334/jors.73

Abstract

The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking, and the lack of open, transparent and fair benchmark sets, present another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.
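The platform described in the abstract rests on capturing a researcher's software dependencies so that a computation can be replayed away from the original workstation. As a loose illustration only — the paper does not prescribe an implementation, and the function name and manifest fields below are invented for this sketch — one small building block of such dependency abstraction is a machine-readable environment manifest:

```python
# Sketch (illustrative, not from the paper): record the interpreter version,
# platform, and installed packages so an analysis environment can be
# reconstructed later. Uses only the Python standard library.
import json
import platform
from importlib import metadata


def environment_manifest() -> dict:
    """Return a snapshot of the current Python environment."""
    return {
        "python": platform.python_version(),
        "platform": platform.platform(),
        "packages": sorted(
            f"{dist.metadata['Name']}=={dist.version}"
            for dist in metadata.distributions()
        ),
    }


if __name__ == "__main__":
    # Emit the manifest as JSON, suitable for archiving alongside results.
    print(json.dumps(environment_manifest(), indent=2))
```

Archiving such a manifest with each result is a far weaker guarantee than the automated platform the authors propose, but it shows the kind of information that platform would have to capture on the researcher's behalf.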


Published in: Journal of Open Research Software
ISSN: 2049-9647
Published: Ubiquity Press, 9 November 2017
Keywords: reproducible research, cyberinfrastructure, scientific workflows, computational science, open science, data sharing, code sharing, best practices
Online Access: https://openresearchsoftware.metajnl.com/articles/10.5334/jors.73/

URI: https://cronfa.swan.ac.uk/Record/cronfa43573