Journal article
Design and evaluation of a time adaptive multimodal virtual keyboard / Yogesh Kumar Meena; Hubert Cecotti; KongFatt Wong-Lin; Girijesh Prasad
Journal on Multimodal User Interfaces
Swansea University Author: Yogesh Kumar Meena
PDF | Version of Record, released under the terms of a Creative Commons Attribution 4.0 International License (CC-BY)
The usability of virtual-keyboard-based eye-typing systems is currently limited by the lack of adaptive, user-centered approaches, leading to low text entry rates and the need for frequent recalibration. In this work, we propose a set of methods for dwell-time adaptation in asynchronous mode and trial-period adaptation in synchronous mode for gaze-based virtual keyboards. The rules take into account commands that allow corrections in the application, and they have been tested on a newly developed virtual keyboard for a structurally complex language using a two-stage, tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with a multimodal access facility, wherein the search for a target item is achieved through gaze detection and the selection can happen via a dwell time, a soft-switch, or gesture detection using surface electromyography (sEMG) in asynchronous mode; in synchronous mode, both search and selection may be performed with the eye-tracker alone. The system performance is evaluated in terms of text entry rate and information transfer rate across 20 different experimental conditions. The proposed strategy for adapting the parameters over time shows a significant improvement (more than 40%) over non-adaptive approaches for new users. The multimodal dwell-free mechanism using a combination of eye-tracking and a soft-switch provides better performance than the adaptive methods using eye-tracking only. The overall system receives an excellent grade on the adjective rating scale of the System Usability Scale and a low weighted rating on the NASA Task Load Index, demonstrating the user-centered focus of the system.
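The abstract's core idea, adapting the dwell-time threshold as the user types, can be illustrated with a minimal sketch. The paper's actual update rules are not reproduced in this record, so the bounds, step size, and the rule itself (shorten the dwell after a successful selection, lengthen it after a correction command such as backspace) are assumptions for illustration only:

```python
# Hypothetical sketch of dwell-time adaptation for a gaze-based virtual
# keyboard. Not the paper's actual rules: constants and the update logic
# are illustrative assumptions.

DWELL_MIN_MS = 300.0   # assumed lower bound on the dwell threshold
DWELL_MAX_MS = 1200.0  # assumed upper bound on the dwell threshold
STEP_MS = 50.0         # assumed adaptation step per selection

def adapt_dwell(dwell_ms: float, was_correction: bool) -> float:
    """Return the next dwell threshold after one selection event."""
    if was_correction:
        dwell_ms += STEP_MS  # user issued a correction: allow more time
    else:
        dwell_ms -= STEP_MS  # successful selection: speed up typing
    # Clamp to keep the threshold usable.
    return max(DWELL_MIN_MS, min(DWELL_MAX_MS, dwell_ms))

# Simulate a short session: three successful selections, then one correction.
dwell = 800.0
for corrected in [False, False, False, True]:
    dwell = adapt_dwell(dwell, corrected)
print(dwell)  # 700.0
```

In a dwell-free configuration, the same loop would instead gate selection on an external event (a soft-switch press or an sEMG gesture) and the threshold adaptation would not apply.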
Gaze-based access control; Adaptive control; Multimodal dwell-free control; Graphical user interface; Virtual keyboard; Eye-typing
College of Science