
Multimodal Recognition of Emotions in Music and Facial Expressions


Bibliographic Details
Main Authors: Proverbio, Alice Mado, Camporeale, Elisa, Brusa, Alessandra
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7027335/
https://www.ncbi.nlm.nih.gov/pubmed/32116613
http://dx.doi.org/10.3389/fnhum.2020.00032
author Proverbio, Alice Mado
Camporeale, Elisa
Brusa, Alessandra
collection PubMed
description The aim of the study was to investigate the neural processing of congruent vs. incongruent affective audiovisual information (facial expressions and music) by means of event-related potential (ERP) recordings. Stimuli were 200 infant faces displaying happiness, relaxation, sadness, or distress, and 32 piano pieces conveying the same emotional states (as specifically assessed). Music and faces were presented simultaneously, paired so that in half of the cases they were emotionally congruent and in half incongruent. Twenty subjects were instructed to attend and respond to infrequent targets (adult neutral faces) while their EEG was recorded from 128 channels. The face-related N170 (160–180 ms) component was the earliest response affected by the emotional content of the faces (particularly by distress), while the visual P300 (250–450 ms) and auditory N400 (350–550 ms) responses were specifically modulated by the emotional content of both the facial expressions and the musical pieces. Face/music emotional incongruence elicited a broad N400 negativity, indicating the detection of a mismatch in the expressed emotion. A swLORETA inverse solution applied to the N400 (difference wave: incongruent − congruent) showed the crucial role of the inferior and superior temporal gyri in the multimodal representation of emotional information extracted from faces and music. Furthermore, the prefrontal cortex (superior and medial, BA 10) was also strongly active, possibly supporting working memory. The data hint at a common system, including the uncus and cuneus, for representing emotional information derived from social cognition and music processing.
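The N400 analysis described in the abstract rests on a standard difference-wave computation: trial-averaged ERPs for the incongruent condition minus the congruent condition, quantified as the mean amplitude within the 350–550 ms window. The sketch below illustrates that step on simulated data with NumPy; it is not the authors' code, and all names, shapes, and parameter values (sampling rate, trial counts, effect size) are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch of the incongruent-minus-congruent difference wave
# and mean-amplitude measure described in the abstract (simulated data).
rng = np.random.default_rng(0)
sfreq = 500                               # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.8, 1 / sfreq)   # epoch: -100 to 800 ms

# Simulated single-trial epochs: (n_trials, n_channels, n_samples)
n_trials, n_channels = 40, 128
congruent = rng.normal(0.0, 1.0, (n_trials, n_channels, times.size))
incongruent = rng.normal(0.0, 1.0, (n_trials, n_channels, times.size))

# Inject a negative deflection into incongruent trials in the N400 window,
# mimicking the mismatch-related negativity reported in the study.
n400_mask = (times >= 0.35) & (times <= 0.55)
incongruent[:, :, n400_mask] -= 2.0

# Average across trials per condition, then subtract: incongruent - congruent.
diff_wave = incongruent.mean(axis=0) - congruent.mean(axis=0)

# Mean amplitude of the difference wave in 350-550 ms, one value per channel.
n400_amplitude = diff_wave[:, n400_mask].mean(axis=1)
print(n400_amplitude.mean())  # negative overall: the N400 incongruence effect
```

In an actual ERP pipeline the same subtraction would be performed on preprocessed, artifact-rejected epochs, and the resulting difference wave (not the raw conditions) would be passed to the swLORETA inverse solution.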
format Online Article Text
id pubmed-7027335
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-7027335 2020-02-28. Published in Front Hum Neurosci (Neuroscience) by Frontiers Media S.A., 2020-02-11. Text en. Copyright © 2020 Proverbio, Camporeale and Brusa.
http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title Multimodal Recognition of Emotions in Music and Facial Expressions
topic Neuroscience