Journal article
Speakers Raise their Hands and Head during Self-Repairs in Dyadic Conversations
IEEE Transactions on Cognitive and Developmental Systems, Pages: 1 - 1
Swansea University Author: Julian Hough
Full text not available from this repository.
DOI (Published version): 10.1109/tcds.2023.3254808
Published in: IEEE Transactions on Cognitive and Developmental Systems
ISSN: 2379-8920, 2379-8939
Published: Institute of Electrical and Electronics Engineers (IEEE), 2023
URI: https://cronfa.swan.ac.uk/Record/cronfa64930
Abstract:
People often encounter difficulties in building shared understanding during everyday conversation. The most common symptom of these difficulties is self-repair, when a speaker restarts, edits or amends their utterances mid-turn. Previous work has focused on the verbal signals of self-repair, i.e. speech disfluencies (filled pauses, truncated words and phrases, word substitutions or reformulations), and computational tools now exist that can automatically detect these verbal phenomena. However, face-to-face conversation also exploits rich non-verbal resources, and previous research suggests that self-repairs are associated with distinct hand movement patterns. This paper extends those results by exploring head and hand movements of both speakers and listeners using two motion parameters: height (vertical position) and 3D velocity. The results show that speech sequences containing self-repairs are distinguishable from fluent ones: speakers raise their hands and head more (and move more rapidly) during self-repairs. We obtain these results by analysing data from a corpus of 13 unscripted dialogues, and we discuss how these findings could support the creation of improved cognitive artificial systems for natural human-machine and human-robot interaction.
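For readers wanting a concrete sense of the two motion parameters named in the abstract, the following is a minimal illustrative sketch, not the authors' pipeline: the function name, frame rate, vertical-axis convention and units are assumptions introduced here for illustration only.

```python
import numpy as np

def motion_parameters(positions, fps=100.0):
    """Compute per-frame height and 3D speed from a sequence of 3D positions.

    positions: array of shape (n_frames, 3), e.g. wrist or head coordinates in metres.
    fps: assumed capture frame rate (frames per second).
    Returns (height, speed): height is the vertical coordinate per frame;
    speed is the magnitude of the frame-to-frame 3D velocity vector.
    """
    positions = np.asarray(positions, dtype=float)
    height = positions[:, 2]                    # assume the z axis is vertical
    dt = 1.0 / fps                              # time between consecutive frames
    velocity = np.diff(positions, axis=0) / dt  # per-frame 3D velocity vectors
    speed = np.linalg.norm(velocity, axis=1)    # 3D speed in metres per second
    return height, speed
```

Such per-frame values could then be averaged inside and outside self-repair spans (identified from time-aligned disfluency annotations) to compare movement during repair and fluent speech.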
Keywords: Maintenance engineering, Oral communication, Natural language processing, Task analysis, Magnetic heads, Speech recognition, Human-robot interaction
College: Faculty of Science and Engineering
Funders: School of Electronic Engineering and Computer Science (10.13039/100009149; Grant Number EP/L01632X/1); Engineering and Physical Sciences Research Council (10.13039/501100000266; Grant Numbers EP/R02572X/1 and EP/S00453X/1)
Start Page: 1
End Page: 1