Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults


Bibliographic Details
Main Authors: Tan, Jun-Wen, Andrade, Adriano O., Li, Hang, Walter, Steffen, Hrabal, David, Rukavina, Stefanie, Limbrecht-Ecklundt, Kerstin, Hoffman, Holger, Traue, Harald C.
Format: Online Article Text
Language: English
Published: Public Library of Science 2016
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4712064/
https://www.ncbi.nlm.nih.gov/pubmed/26761427
http://dx.doi.org/10.1371/journal.pone.0146691
_version_ 1782410005652176896
author Tan, Jun-Wen
Andrade, Adriano O.
Li, Hang
Walter, Steffen
Hrabal, David
Rukavina, Stefanie
Limbrecht-Ecklundt, Kerstin
Hoffman, Holger
Traue, Harald C.
author_facet Tan, Jun-Wen
Andrade, Adriano O.
Li, Hang
Walter, Steffen
Hrabal, David
Rukavina, Stefanie
Limbrecht-Ecklundt, Kerstin
Hoffman, Holger
Traue, Harald C.
author_sort Tan, Jun-Wen
collection PubMed
description BACKGROUND: Research suggests that interaction between humans and digital environments constitutes a form of companionship in addition to technical convenience. To this end, humans have attempted to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one such technique enabling machines to access human affective states. Numerous studies have investigated the effects of valence emotions on facial EMG activity captured over the corrugator supercilii (frowning muscle) and zygomaticus major (smiling muscle). Arousal, however, has received comparatively little research attention. In the present study, we sought to identify intensive valence and arousal affective states via facial EMG activity. METHODS: Ten blocks of affective pictures were separated into five categories: neutral valence/low arousal (0VLA), positive valence/high arousal (PVHA), negative valence/high arousal (NVHA), positive valence/low arousal (PVLA), and negative valence/low arousal (NVLA), and the ability of each to elicit the corresponding valence and arousal affective states was investigated at length. One hundred and thirteen participants were exposed to these stimuli and provided facial EMG. A set of 16 features based on the amplitude, frequency, predictability, and variability of the signals was defined and classified using a support vector machine (SVM). RESULTS: We observed highly accurate classification rates based on the combined corrugator and zygomaticus EMG, ranging from 75.69% to 100.00% for the baseline and five affective states (0VLA, PVHA, PVLA, NVHA, and NVLA) across all individuals. There were significant differences in classification accuracy between senior and young adults, but no significant difference between female and male participants.
CONCLUSION: Our research provides robust evidence for the recognition of intensive valence and arousal affective states in young and senior adults. These findings support the successful future application of facial EMG for identifying user affective states in human-machine interaction (HMI) or companion robotic systems (CRS).
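The feature-extraction step the abstract describes (amplitude-, frequency-, predictability-, and variability-based measures computed from facial EMG) can be sketched as follows. The study's exact 16 features and their parameters are not specified in this record, so the four features below (RMS, variance, mean spectral frequency, zero-crossing rate) are illustrative assumptions standing in for each feature family, not the authors' actual feature set:

```python
import numpy as np

def emg_features(signal, fs=1000.0):
    """Illustrative EMG features of the four kinds named in the abstract.

    signal: 1-D array of EMG samples; fs: sampling rate in Hz (assumed).
    """
    signal = np.asarray(signal, dtype=float)
    rms = float(np.sqrt(np.mean(signal ** 2)))      # amplitude feature
    var = float(np.var(signal))                     # variability feature
    # Mean frequency of the power spectrum (frequency feature).
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    mean_freq = float(np.sum(freqs * spectrum) / np.sum(spectrum))
    # Zero-crossing rate as a simple predictability proxy.
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal))) > 0))
    return {"rms": rms, "var": var, "mean_freq": mean_freq, "zcr": zcr}

# Synthetic EMG-like signal (white noise stand-in for a real recording).
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 1.0, 2000)
feats = emg_features(sig)
```

In the study's pipeline, a vector of such features per trial (16 of them, per the abstract) would then be fed to an SVM classifier to label the six states (baseline, 0VLA, PVHA, PVLA, NVHA, NVLA).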
format Online
Article
Text
id pubmed-4712064
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-47120642016-01-26 Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults Tan, Jun-Wen Andrade, Adriano O. Li, Hang Walter, Steffen Hrabal, David Rukavina, Stefanie Limbrecht-Ecklundt, Kerstin Hoffman, Holger Traue, Harald C. PLoS One Research Article BACKGROUND: Research suggests that interaction between humans and digital environments constitutes a form of companionship in addition to technical convenience. To this end, humans have attempted to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one such technique enabling machines to access human affective states. Numerous studies have investigated the effects of valence emotions on facial EMG activity captured over the corrugator supercilii (frowning muscle) and zygomaticus major (smiling muscle). Arousal, however, has received comparatively little research attention. In the present study, we sought to identify intensive valence and arousal affective states via facial EMG activity. METHODS: Ten blocks of affective pictures were separated into five categories: neutral valence/low arousal (0VLA), positive valence/high arousal (PVHA), negative valence/high arousal (NVHA), positive valence/low arousal (PVLA), and negative valence/low arousal (NVLA), and the ability of each to elicit the corresponding valence and arousal affective states was investigated at length. One hundred and thirteen participants were exposed to these stimuli and provided facial EMG. A set of 16 features based on the amplitude, frequency, predictability, and variability of the signals was defined and classified using a support vector machine (SVM).
RESULTS: We observed highly accurate classification rates based on the combined corrugator and zygomaticus EMG, ranging from 75.69% to 100.00% for the baseline and five affective states (0VLA, PVHA, PVLA, NVHA, and NVLA) across all individuals. There were significant differences in classification accuracy between senior and young adults, but no significant difference between female and male participants. CONCLUSION: Our research provides robust evidence for the recognition of intensive valence and arousal affective states in young and senior adults. These findings support the successful future application of facial EMG for identifying user affective states in human-machine interaction (HMI) or companion robotic systems (CRS). Public Library of Science 2016-01-13 /pmc/articles/PMC4712064/ /pubmed/26761427 http://dx.doi.org/10.1371/journal.pone.0146691 Text en © 2016 Tan et al http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Tan, Jun-Wen
Andrade, Adriano O.
Li, Hang
Walter, Steffen
Hrabal, David
Rukavina, Stefanie
Limbrecht-Ecklundt, Kerstin
Hoffman, Holger
Traue, Harald C.
Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults
title Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults
title_full Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults
title_fullStr Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults
title_full_unstemmed Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults
title_short Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults
title_sort recognition of intensive valence and arousal affective states via facial electromyographic activity in young and senior adults
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4712064/
https://www.ncbi.nlm.nih.gov/pubmed/26761427
http://dx.doi.org/10.1371/journal.pone.0146691
work_keys_str_mv AT tanjunwen recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT andradeadrianoo recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT lihang recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT waltersteffen recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT hrabaldavid recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT rukavinastefanie recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT limbrechtecklundtkerstin recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT hoffmanholger recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults
AT traueharaldc recognitionofintensivevalenceandarousalaffectivestatesviafacialelectromyographicactivityinyoungandsenioradults