
Audience facial expressions detected by automated face analysis software reflect emotions in music


Bibliographic Details
Main Authors: Kayser, Diana, Egermann, Hauke, Barraclough, Nick E.
Format: Online Article Text
Language: English
Published: Springer US 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9170626/
https://www.ncbi.nlm.nih.gov/pubmed/34508286
http://dx.doi.org/10.3758/s13428-021-01678-3
Collection: PubMed
Description: An abundance of studies on emotional experiences in response to music has been published over the past decades; however, most have been carried out in controlled laboratory settings and rely on subjective reports. Facial expressions have occasionally been assessed, but measured using intrusive methods such as facial electromyography (fEMG). The present study investigated the emotional experiences of fifty participants at a live concert. Our aims were to explore whether automated face analysis could detect facial expressions of emotion in a group of people in an ecologically valid listening context, to determine whether emotions expressed by the music predicted specific facial expressions, and to examine whether facial expressions of emotion could be used to predict subjective ratings of pleasantness and activation. During the concert, participants were filmed and their facial expressions were subsequently analyzed with automated face analysis software. Self-reports of participants’ subjective experience of pleasantness and activation were collected after the concert for all pieces (two happy, two sad). Our results show that the pieces that expressed sadness elicited more facial expressions of sadness (compared to happiness), whereas the pieces that expressed happiness elicited more facial expressions of happiness (compared to sadness). Differences for the other facial expression categories (anger, fear, surprise, disgust, and neutral) were not found. Independent of the musical piece or the emotion expressed in the music, facial expressions of happiness predicted ratings of subjectively felt pleasantness, whilst facial expressions of sadness and disgust predicted low and high ratings of subjectively felt activation, respectively. Together, our results show that non-invasive measurement of audience facial expressions in a naturalistic concert setting is indicative of the emotions expressed by the music and of the subjective experiences of the audience members themselves.
ID: pubmed-9170626
Institution: National Center for Biotechnology Information
Record format: MEDLINE/PubMed
Journal: Behav Res Methods
Published online: 2021-09-10 (2022)
License: © The Author(s) 2021. Open Access under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).