Automatic Emotion Perception Using Eye Movement Information for E-Healthcare Systems
Detecting the emotional state of adolescents is vital for promoting rehabilitation therapy within an E-Healthcare system. Focusing on a novel approach for a sensor-based E-Healthcare system, we propose an emotion perception algorithm based on eye movement information that synchronously collects and analyzes electrooculography (EOG) signals and eye movement video. Specifically, we first extract time-frequency eye movement features by applying the short-time Fourier transform (STFT) to the raw multi-channel EOG signals. Subsequently, to integrate time-domain eye movement features (i.e., saccade duration, fixation duration, and pupil diameter), we investigate two fusion strategies: feature-level fusion (FLF) and decision-level fusion (DLF). Recognition experiments were performed for three emotional states: positive, neutral, and negative. The average accuracies are 88.64% (FLF) and 88.35% (DLF with the maximal rule), respectively. The results show that eye movement information can effectively reflect the emotional state of adolescents, providing a promising tool for improving the performance of E-Healthcare systems.

Main Authors: | Wang, Yang; Lv, Zhao; Zheng, Yongjun |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2018 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6164228/ https://www.ncbi.nlm.nih.gov/pubmed/30150554 http://dx.doi.org/10.3390/s18092826 |
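The abstract outlines a pipeline of STFT-based time-frequency features from multi-channel EOG, fused with time-domain eye movement features either by concatenation (FLF) or by a maximal rule over classifier probabilities (DLF). The following Python sketch illustrates that pipeline under stated assumptions; the channel count, sampling rate, window size, and SVM classifier are illustrative choices, not the authors' implementation.

```python
"""Illustrative sketch (not the authors' code) of the pipeline described in
the abstract: STFT features from multi-channel EOG, feature-level fusion with
time-domain features, and decision-level fusion with the maximal rule."""
import numpy as np
from scipy.signal import stft
from sklearn.svm import SVC

FS = 250          # assumed EOG sampling rate (Hz)
N_CHANNELS = 4    # assumed number of EOG channels

def stft_features(eog, fs=FS, nperseg=128):
    """Per-channel log-power STFT features, averaged over time frames."""
    feats = []
    for ch in eog:                                  # eog: (channels, samples)
        _, _, Z = stft(ch, fs=fs, nperseg=nperseg)
        feats.append(np.log(np.abs(Z) ** 2 + 1e-12).mean(axis=1))
    return np.concatenate(feats)

def feature_level_fusion(eog, time_domain):
    """FLF: concatenate time-frequency and time-domain feature vectors."""
    return np.concatenate([stft_features(eog), time_domain])

def decision_level_fusion(prob_list):
    """DLF, maximal rule: choose the class with the single highest
    probability across the per-feature-set classifiers."""
    probs = np.stack(prob_list)          # (n_classifiers, n_classes)
    return int(np.unravel_index(probs.argmax(), probs.shape)[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data standing in for EOG epochs and saccade/fixation/pupil features.
    X_flf, y = [], []
    for label in (0, 1, 2):              # negative / neutral / positive
        for _ in range(10):
            eog = rng.standard_normal((N_CHANNELS, FS * 2))
            td = rng.standard_normal(3)  # saccade dur., fixation dur., pupil diam.
            X_flf.append(feature_level_fusion(eog, td))
            y.append(label)
    clf = SVC(probability=True).fit(np.array(X_flf), y)
    print("FLF prediction:", clf.predict(np.array(X_flf[:1])))

    # DLF example with two hypothetical per-feature-set probability vectors.
    p_stft = np.array([0.2, 0.3, 0.5])
    p_time = np.array([0.1, 0.7, 0.2])
    print("DLF (maximal rule) class:", decision_level_fusion([p_stft, p_time]))
```

The sketch only demonstrates the fusion mechanics; the paper's reported accuracies (88.64% FLF, 88.35% DLF) come from the authors' own feature set and evaluation protocol.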
id | pubmed-6164228 |
---|---|
collection | PubMed |
institution | National Center for Biotechnology Information |
record_format | MEDLINE/PubMed |
journal | Sensors (Basel) |
publishDate | 2018-08-27 |
publisher | MDPI |
license | © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |