Journal article
Spatiotemporal dynamics in human visual cortex rapidly encode the emotional content of faces
Human Brain Mapping, Volume: 39, Issue: 10, Pages: 3993 - 4006
Swansea University Author: Jiaxiang Zhang
Copyright: 2018 The Authors. This is an open access article under the terms of the Creative Commons Attribution License.
Recognizing emotion in faces is important in human interaction and survival, yet existing studies do not paint a consistent picture of the neural representation supporting this task. To address this, we collected magnetoencephalography (MEG) data while participants passively viewed happy, angry and neutral faces. Using time-resolved decoding of sensor-level data, we show that responses to angry faces can be discriminated from happy and neutral faces as early as 90 ms after stimulus onset and only 10 ms later than faces can be discriminated from scrambled stimuli, even in the absence of differences in evoked responses. Time-resolved relevance patterns in source space track expression-related information from the visual cortex (100 ms) to higher-level temporal and frontal areas (200–500 ms). Together, our results point to a system optimised for rapid processing of emotional faces and preferentially tuned to threat, consistent with the important evolutionary role that such a system must have played in the development of human social interactions.
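The time-resolved decoding described above trains and evaluates a separate classifier at each time point of the epoched data, so that decoding accuracy can be traced as a function of time after stimulus onset. A minimal sketch of this idea on synthetic data is shown below; the specific choices (logistic regression, 5-fold cross-validation, toy dimensions, and the injected class difference) are illustrative assumptions, not the paper's actual pipeline.

```python
# Time-resolved MVPA decoding sketch on synthetic "MEG" data:
# one cross-validated classifier per time point.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 80, 32, 50           # toy dimensions
X = rng.standard_normal((n_trials, n_sensors, n_times))
y = np.repeat([0, 1], n_trials // 2)                # e.g. angry vs. happy labels

# Inject a class difference from "time point" 20 onward, mimicking
# expression-related information emerging after stimulus onset.
X[y == 1, :, 20:] += 0.8

# Train and score an independent classifier at every time point.
scores = np.array([
    cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])

print(scores[:20].mean())   # near chance (~0.5) before the effect
print(scores[20:].mean())   # above chance once information is present
```

The earliest time point at which accuracy reliably exceeds chance gives the kind of onset latency reported in the abstract (e.g. 90 ms for angry vs. happy/neutral faces), and the sensor weights at each time point can be projected to source space to track where the discriminative information arises.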
face perception, magnetoencephalography (MEG), multivariate pattern analysis (MVPA), threat bias
Faculty of Science and Engineering
Medical Research Council and Engineering and Physical Sciences Research Council, Grant/Award Number: MR/K00546/