Facial EMG sensing for monitoring affect using a wearable device

Bibliographic Details
Main Authors: Gjoreski, Martin, Kiprijanovska, Ivana, Stankoski, Simon, Mavridou, Ifigeneia, Broulidakis, M. John, Gjoreski, Hristijan, Nduka, Charles
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9542454/
https://www.ncbi.nlm.nih.gov/pubmed/36207524
http://dx.doi.org/10.1038/s41598-022-21456-1
_version_ 1784804152764792832
author Gjoreski, Martin
Kiprijanovska, Ivana
Stankoski, Simon
Mavridou, Ifigeneia
Broulidakis, M. John
Gjoreski, Hristijan
Nduka, Charles
author_facet Gjoreski, Martin
Kiprijanovska, Ivana
Stankoski, Simon
Mavridou, Ifigeneia
Broulidakis, M. John
Gjoreski, Hristijan
Nduka, Charles
author_sort Gjoreski, Martin
collection PubMed
description Using a novel wearable surface electromyography (sEMG) device, we investigated induced affective states by measuring the activation of facial muscles traditionally associated with positive expressions (left/right orbicularis and left/right zygomaticus) and negative expressions (the corrugator muscle). In a sample of 38 participants who watched 25 affective videos in a virtual reality environment, we found that sEMG amplitude varied significantly with video content for each of the three variables examined: subjective valence, subjective arousal, and objective valence measured via the validated video types (positive, neutral, and negative). sEMG amplitude from “positive muscles” increased when participants were exposed to positively valenced stimuli compared with negatively valenced stimuli. In contrast, activation of “negative muscles” was elevated following exposure to negatively valenced stimuli compared with positively valenced stimuli. High-arousal videos increased muscle activation compared with low-arousal videos in all the measured muscles except the corrugator. In line with previous research, the relationship between sEMG amplitude and subjective valence was V-shaped.
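The dependent measure throughout the abstract is sEMG amplitude per muscle channel. Purely as an illustrative sketch, and not the authors' published processing pipeline, a moving-window RMS envelope is one common way such an amplitude is derived from a band-pass-filtered sEMG channel; the function name, sampling rate, and window length below are assumptions for the example.

import numpy as np

def semg_rms_envelope(signal, fs, window_s=0.1):
    """Moving-window RMS amplitude of one sEMG channel (illustrative only).

    signal   : 1-D array of band-pass-filtered sEMG samples
    fs       : sampling rate in Hz (assumed value, not from the paper)
    window_s : RMS window length in seconds
    """
    win = max(1, int(window_s * fs))
    squared = np.square(signal - np.mean(signal))                     # remove DC offset, then square
    mean_sq = np.convolve(squared, np.ones(win) / win, mode="same")   # moving mean of squared signal
    return np.sqrt(mean_sq)                                           # RMS envelope, same length as input

# Hypothetical usage: mean amplitude of one muscle channel over one video clip
fs = 1000                                    # Hz (assumed)
raw = np.random.randn(60 * fs)               # 60 s of placeholder data, not real sEMG
clip_amplitude = semg_rms_envelope(raw, fs).mean()

Averaging such an envelope per video and per muscle group would give one amplitude value per condition, which is the kind of quantity compared across positive, neutral, and negative videos in the study.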
format Online
Article
Text
id pubmed-9542454
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-9542454 2022-10-09 Facial EMG sensing for monitoring affect using a wearable device Gjoreski, Martin Kiprijanovska, Ivana Stankoski, Simon Mavridou, Ifigeneia Broulidakis, M. John Gjoreski, Hristijan Nduka, Charles Sci Rep Article Using a novel wearable surface electromyography (sEMG) device, we investigated induced affective states by measuring the activation of facial muscles traditionally associated with positive expressions (left/right orbicularis and left/right zygomaticus) and negative expressions (the corrugator muscle). In a sample of 38 participants who watched 25 affective videos in a virtual reality environment, we found that sEMG amplitude varied significantly with video content for each of the three variables examined: subjective valence, subjective arousal, and objective valence measured via the validated video types (positive, neutral, and negative). sEMG amplitude from “positive muscles” increased when participants were exposed to positively valenced stimuli compared with negatively valenced stimuli. In contrast, activation of “negative muscles” was elevated following exposure to negatively valenced stimuli compared with positively valenced stimuli. High-arousal videos increased muscle activation compared with low-arousal videos in all the measured muscles except the corrugator. In line with previous research, the relationship between sEMG amplitude and subjective valence was V-shaped. Nature Publishing Group UK 2022-10-07 /pmc/articles/PMC9542454/ /pubmed/36207524 http://dx.doi.org/10.1038/s41598-022-21456-1 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) .
spellingShingle Article
Gjoreski, Martin
Kiprijanovska, Ivana
Stankoski, Simon
Mavridou, Ifigeneia
Broulidakis, M. John
Gjoreski, Hristijan
Nduka, Charles
Facial EMG sensing for monitoring affect using a wearable device
title Facial EMG sensing for monitoring affect using a wearable device
title_full Facial EMG sensing for monitoring affect using a wearable device
title_fullStr Facial EMG sensing for monitoring affect using a wearable device
title_full_unstemmed Facial EMG sensing for monitoring affect using a wearable device
title_short Facial EMG sensing for monitoring affect using a wearable device
title_sort facial emg sensing for monitoring affect using a wearable device
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9542454/
https://www.ncbi.nlm.nih.gov/pubmed/36207524
http://dx.doi.org/10.1038/s41598-022-21456-1
work_keys_str_mv AT gjoreskimartin facialemgsensingformonitoringaffectusingawearabledevice
AT kiprijanovskaivana facialemgsensingformonitoringaffectusingawearabledevice
AT stankoskisimon facialemgsensingformonitoringaffectusingawearabledevice
AT mavridouifigeneia facialemgsensingformonitoringaffectusingawearabledevice
AT broulidakismjohn facialemgsensingformonitoringaffectusingawearabledevice
AT gjoreskihristijan facialemgsensingformonitoringaffectusingawearabledevice
AT ndukacharles facialemgsensingformonitoringaffectusingawearabledevice