
Journal article

Design and evaluation of a time adaptive multimodal virtual keyboard

Yogesh Kumar Meena, Hubert Cecotti, KongFatt Wong-Lin, Girijesh Prasad

Journal on Multimodal User Interfaces

Swansea University Author: Yogesh Kumar Meena

  • 51091.pdf

    PDF | Version of Record

    Released under the terms of a Creative Commons Attribution 4.0 International License (CC-BY).

    Download (1.41MB)

Abstract

The usability of virtual keyboard based eye-typing systems is currently limited due to the lack of adaptive and user-centered approaches, leading to a low text entry rate and the need for frequent recalibration. In this work, we propose a set of methods for dwell time adaptation in asynchronous mode and trial period adaptation in synchronous mode for gaze-based virtual keyboards. The rules take into account commands that allow corrections in the application, and the approach has been tested on a newly developed virtual keyboard for a structurally complex language using a two-stage tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with a multimodal access facility, wherein the search for a target item is achieved through gaze detection and the selection can happen via a dwell time, a soft-switch, or gesture detection using surface electromyography (sEMG) in asynchronous mode; in synchronous mode, both search and selection may be performed with the eye-tracker alone. System performance is evaluated in terms of text entry rate and information transfer rate across 20 different experimental conditions. The proposed strategy for adapting the parameters over time shows a significant improvement (more than 40%) over non-adaptive approaches for new users. The multimodal dwell-free mechanism using a combination of eye-tracking and soft-switch provides better performance than the adaptive methods with eye-tracking only. The overall system receives an excellent grade on the adjective rating scale of the System Usability Scale and a low weighted rating on the NASA Task Load Index, demonstrating the user-centered focus of the system.
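
The record does not spell out the authors' adaptation rules, but the general idea of adjusting a dwell threshold from user feedback can be sketched briefly. The minimal Python sketch below is an illustrative, hypothetical rule (shorten the dwell time after clean selections, lengthen it after corrections, within bounds); the class name, parameter values, and update logic are assumptions for illustration only, not the method evaluated in the paper.

# Hypothetical dwell-time adaptation sketch (NOT the rule from the paper; the record
# does not give the authors' update equations). Idea: shorten the dwell threshold
# after clean selections and lengthen it after corrections, within safe bounds.

class DwellTimeAdapter:
    def __init__(self, initial_ms=1000, min_ms=400, max_ms=2000, step_ms=50):
        self.dwell_ms = initial_ms    # current dwell threshold in milliseconds
        self.min_ms = min_ms          # lower bound to avoid accidental selections
        self.max_ms = max_ms          # upper bound to keep typing usable
        self.step_ms = step_ms        # adaptation step size (assumed value)

    def update(self, selection_correct: bool) -> int:
        """Adapt the dwell threshold after each selection.

        selection_correct: False if the user issued a correction/delete command
        right after the selection, True otherwise.
        """
        if selection_correct:
            # User is keeping up: cautiously reduce the dwell time.
            self.dwell_ms = max(self.min_ms, self.dwell_ms - self.step_ms)
        else:
            # A correction suggests the threshold is too aggressive: back off.
            self.dwell_ms = min(self.max_ms, self.dwell_ms + 2 * self.step_ms)
        return self.dwell_ms


if __name__ == "__main__":
    adapter = DwellTimeAdapter()
    # Simulated outcome sequence: mostly correct selections with one correction.
    for ok in [True, True, True, False, True]:
        print(adapter.update(ok))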


Published in: Journal on Multimodal User Interfaces
ISSN: 1783-7677 (print), 1783-8738 (electronic)
Published: Springer, 2019
DOI: 10.1007/s12193-019-00293-z
Keywords: Gaze-based access control; Adaptive control; Multimodal dwell-free control; Graphical user interface; Virtual keyboard; Eye-typing

URI: https://cronfa.swan.ac.uk/Record/cronfa51091