
Journal article

Should open-book, open-web exams replace traditional closed-book exams in STEM? An evaluation of their effectiveness in different disciplines

Laura Roberts, Joanne Berry

Journal of Learning Development in Higher Education, Volume: 28, Issue: 28

Swansea University Authors: Laura Roberts, Joanne Berry

  • 65555.VOR.pdf

    PDF | Version of Record

    This work is licensed under a Creative Commons Attribution 4.0 International License.

    Download (833.33KB)

Abstract

The mass shift to Open-Book, Open-Web (OBOW) assessments during the pandemic highlighted new opportunities in Higher Education for developing accessible, authentic assessments that can reduce administrative load. Despite a plethora of research emerging on the effectiveness of OBOW assessments within disciplines, few currently evaluate their effectiveness across disciplines where the assessment instrument can vary significantly. This paper aims to evaluate the experience students across STEM subjects had of OBOW exams to contribute to an evidence base for emerging post-pandemic assessment policies and strategies. In April 2021, following two cycles of OBOW exams, we surveyed STEM students across a range of subjects to determine their preparation strategy, experiences during the exam, perception of development of higher-order cognitive skills, test anxiety, and how they thought these assessments might enhance employability.

Overall, students from subjects that use assessment instruments requiring analytical, quantitative-based answers (Maths, Physics, Computer Science and Chemistry) adapted their existing study skills less effectively, felt less prepared and experienced higher levels of stress compared to students of subjects using more qualitative, discursive-based answers (Biosciences and Geography). We conclude with recommendations on how to enhance the use of OBOW exams: these include supporting and developing more effective study skills, ensuring assessments align with intended learning outcomes, addressing the issue of academic integrity, promoting inclusivity, and encouraging authentic assessment. Based on the outcomes of this study, we strongly advise that assessment policies that foster the whole-scale roll-out of OBOW assessment consider the inter-disciplinary impacts on learner development, staff training and workload resources.

Published in: Journal of Learning Development in Higher Education
ISSN: 1759-667X
Published: Association for Learning Development in Higher Education (online), 2023
Online Access: Check full text

URI: https://cronfa.swan.ac.uk/Record/cronfa65555
DOI: 10.47408/jldhe.vi28.1030
Published date: 24 September 2023
Keywords: open-book exams; online assessments; STEM; closed-book exams
ORCID (Joanne Berry): 0000-0002-8212-8440
Affiliation: Faculty of Humanities and Social Sciences, School of Culture and Communication (Classics, Ancient History, Egyptology), Swansea University