Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements

Giving a robot the ability to perceive emotion in its environment can improve human-robot interaction (HRI), thereby facilitating more human-like communication. To achieve emotion recognition in different built environments for adolescents, we propose a multi-modal emotion intensity perception method using an integration of electroencephalography (EEG) and eye movement information. Specifically, we first develop a new stimulus video selection method based on computation of normalized arousal and valence scores according to subjective feedback from participants. Then, we establish a valence perception sub-model and an arousal sub-model by collecting and analyzing emotional EEG and eye movement signals, respectively. We employ this dual recognition method to perceive emotional intensities synchronously in two dimensions. In the laboratory environment, the best recognition accuracies of the modality fusion for the arousal and valence dimensions are 72.8% and 69.3%, respectively. The experimental results validate the feasibility of the proposed multi-modal emotion recognition method for environmental emotion intensity perception. This promising tool not only achieves more accurate emotion perception for HRI systems but also provides an alternative approach to quantitatively assess environmental psychology.
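The stimulus-selection step above hinges on turning raw participant ratings into comparable normalized arousal and valence scores. As a minimal sketch only: the Python snippet below assumes min-max normalization of per-video mean ratings on a 1-9 self-assessment scale; the function name normalized_scores and the sample video ids are hypothetical, since the paper's exact formula is not reproduced in this record.

from statistics import mean

def normalized_scores(ratings):
    # ratings: video id -> list of per-participant scores (e.g., 1-9 arousal).
    # Returns each video's mean rating rescaled to [0, 1] across the set.
    means = {vid: mean(vals) for vid, vals in ratings.items()}
    lo, hi = min(means.values()), max(means.values())
    if hi == lo:
        # All candidates rated identically; avoid division by zero.
        return {vid: 0.0 for vid in means}
    return {vid: (m - lo) / (hi - lo) for vid, m in means.items()}

# Example: find the clip with the highest normalized arousal score.
arousal = {"park": [2, 3, 2], "plaza": [5, 6, 5], "arcade": [8, 9, 8]}
scores = normalized_scores(arousal)
print(max(scores, key=scores.get))  # -> "arcade"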

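The dual recognition design pairs a valence sub-model fed by EEG with an arousal sub-model fed by eye movements, and their outputs combine into a single point in valence-arousal space. The sketch below is an assumption-laden illustration of that structure, not the authors' published pipeline: the logistic-regression classifiers, the random stand-in features, and the synthetic labels are all placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-ins for extracted features (real inputs would be, e.g., EEG band
# powers and fixation/saccade statistics; these are random placeholders).
X_eeg = rng.normal(size=(100, 8))          # EEG features -> valence sub-model
X_eye = rng.normal(size=(100, 4))          # eye features -> arousal sub-model
y_valence = (X_eeg[:, 0] > 0).astype(int)  # synthetic low/high valence labels
y_arousal = (X_eye[:, 0] > 0).astype(int)  # synthetic low/high arousal labels

valence_model = LogisticRegression().fit(X_eeg, y_valence)
arousal_model = LogisticRegression().fit(X_eye, y_arousal)

# Synchronous two-dimensional perception: each trial gets an independent
# prediction on both axes, i.e., one point in valence-arousal space.
trial_eeg, trial_eye = X_eeg[:1], X_eye[:1]
print(valence_model.predict(trial_eeg)[0], arousal_model.predict(trial_eye)[0])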

Bibliographic Details
Main Authors: Su, Yuanyuan, Li, Wenchao, Bi, Ning, Lv, Zhao
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2019
Subjects: Robotics and AI
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6606730/
https://www.ncbi.nlm.nih.gov/pubmed/31293410
http://dx.doi.org/10.3389/fnbot.2019.00046
_version_ 1783431955741671424
author Su, Yuanyuan
Li, Wenchao
Bi, Ning
Lv, Zhao
author_facet Su, Yuanyuan
Li, Wenchao
Bi, Ning
Lv, Zhao
author_sort Su, Yuanyuan
collection PubMed
description Giving a robot the ability to perceive emotion in its environment can improve human-robot interaction (HRI), thereby facilitating more human-like communication. To achieve emotion recognition in different built environments for adolescents, we propose a multi-modal emotion intensity perception method using an integration of electroencephalography (EEG) and eye movement information. Specifically, we first develop a new stimulus video selection method based on computation of normalized arousal and valence scores according to subjective feedback from participants. Then, we establish a valence perception sub-model and an arousal sub-model by collecting and analyzing emotional EEG and eye movement signals, respectively. We employ this dual recognition method to perceive emotional intensities synchronously in two dimensions. In the laboratory environment, the best recognition accuracies of the modality fusion for the arousal and valence dimensions are 72.8% and 69.3%, respectively. The experimental results validate the feasibility of the proposed multi-modal emotion recognition method for environmental emotion intensity perception. This promising tool not only achieves more accurate emotion perception for HRI systems but also provides an alternative approach to quantitatively assess environmental psychology.
format Online
Article
Text
id pubmed-6606730
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-6606730 2019-07-10 Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements Su, Yuanyuan Li, Wenchao Bi, Ning Lv, Zhao Front Neurorobot Robotics and AI Giving a robot the ability to perceive emotion in its environment can improve human-robot interaction (HRI), thereby facilitating more human-like communication. To achieve emotion recognition in different built environments for adolescents, we propose a multi-modal emotion intensity perception method using an integration of electroencephalography (EEG) and eye movement information. Specifically, we first develop a new stimulus video selection method based on computation of normalized arousal and valence scores according to subjective feedback from participants. Then, we establish a valence perception sub-model and an arousal sub-model by collecting and analyzing emotional EEG and eye movement signals, respectively. We employ this dual recognition method to perceive emotional intensities synchronously in two dimensions. In the laboratory environment, the best recognition accuracies of the modality fusion for the arousal and valence dimensions are 72.8% and 69.3%, respectively. The experimental results validate the feasibility of the proposed multi-modal emotion recognition method for environmental emotion intensity perception. This promising tool not only achieves more accurate emotion perception for HRI systems but also provides an alternative approach to quantitatively assess environmental psychology. Frontiers Media S.A. 2019-06-26 /pmc/articles/PMC6606730/ /pubmed/31293410 http://dx.doi.org/10.3389/fnbot.2019.00046 Text en Copyright © 2019 Su, Li, Bi and Lv. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Robotics and AI
Su, Yuanyuan
Li, Wenchao
Bi, Ning
Lv, Zhao
Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
title Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
title_full Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
title_fullStr Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
title_full_unstemmed Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
title_short Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
title_sort adolescents environmental emotion perception by integrating eeg and eye movements
topic Robotics and AI
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6606730/
https://www.ncbi.nlm.nih.gov/pubmed/31293410
http://dx.doi.org/10.3389/fnbot.2019.00046
work_keys_str_mv AT suyuanyuan adolescentsenvironmentalemotionperceptionbyintegratingeegandeyemovements
AT liwenchao adolescentsenvironmentalemotionperceptionbyintegratingeegandeyemovements
AT bining adolescentsenvironmentalemotionperceptionbyintegratingeegandeyemovements
AT lvzhao adolescentsenvironmentalemotionperceptionbyintegratingeegandeyemovements