
Integrative interaction of emotional speech in audio-visual modality

Emotional cues are expressed in many ways in daily life, and the emotional information we receive is often conveyed through multiple modalities. Successful social interaction requires combining multisensory cues to accurately determine the emotions of others. The integration mechanism of multimodal emotional information has been widely investigated: different brain activity measurement methods have been used to localize the brain regions involved in the audio-visual integration of emotional information, mainly in the bilateral superior temporal regions. However, the methods adopted in these studies are relatively simple, and their stimulus materials rarely contain speech. The integration mechanism of emotional speech in the human brain therefore needs further examination. In this paper, a functional magnetic resonance imaging (fMRI) study with an event-related design was conducted to explore the audio-visual integration mechanism of emotional speech in the human brain, using dynamic facial expressions and emotional speech to convey emotions of different valences. Representational similarity analysis (RSA) based on regions of interest (ROIs), whole-brain searchlight analysis, modality conjunction analysis, and supra-additive analysis were used to identify and verify the roles of the relevant brain regions. In addition, a weighted RSA method was used to evaluate the contribution of each candidate model to the best-fitted model of each ROI. The results showed that only the left insula was detected by all methods, suggesting that the left insula plays an important role in the audio-visual integration of emotional speech. Whole-brain searchlight analysis, modality conjunction analysis, and supra-additive analysis together revealed that the bilateral middle temporal gyrus (MTG), right inferior parietal lobule, and bilateral precuneus might also be involved in the audio-visual integration of emotional speech in other respects.
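The abstract names several multivariate analyses. As a rough illustration of two of them, the sketch below shows the core RSA computation (correlating a neural representational dissimilarity matrix with candidate model RDMs via Spearman rank correlation) and a simple supra-additive check (the audio-visual response exceeding the sum of the unimodal responses). All data, variable names, and model RDMs here are toy placeholders for illustration, not the authors' actual stimuli, ROIs, or code.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Toy ROI data: (n_conditions, n_voxels) activation patterns, e.g.
# emotion x modality conditions. Purely illustrative values.
patterns = rng.standard_normal((6, 200))

# Neural RDM: pairwise correlation distance between condition
# patterns, in condensed (upper-triangle) form.
neural_rdm = pdist(patterns, metric="correlation")

# Hypothetical candidate model RDMs (e.g. an emotion model and a
# modality model), also in condensed form.
emotion_model = pdist(rng.standard_normal((6, 1)))
modality_model = pdist(rng.standard_normal((6, 1)))

# Compare each model RDM to the neural RDM with Spearman correlation,
# the rank-based statistic commonly used in RSA.
for name, model in [("emotion", emotion_model), ("modality", modality_model)]:
    rho, p = spearmanr(neural_rdm, model)
    print(f"{name} model: rho={rho:.3f}, p={p:.3f}")

# Supra-additive criterion sketch: flag a region as integrative when
# its audio-visual response exceeds the sum of the unimodal responses
# (AV > A + V). The beta values are made up for illustration.
beta_av, beta_a, beta_v = 1.9, 0.8, 0.9
print("supra-additive:", beta_av > beta_a + beta_v)

In the weighted RSA variant the abstract mentions, the neural RDM would instead be modeled as a weighted combination of the candidate model RDMs, with the fitted weights indicating each model's contribution.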


Bibliographic Details
Main Authors: Dong, Haibin, Li, Na, Fan, Lingzhong, Wei, Jianguo, Xu, Junhai
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9695733/
https://www.ncbi.nlm.nih.gov/pubmed/36440282
http://dx.doi.org/10.3389/fnins.2022.797277
_version_ 1784838136512118784
author Dong, Haibin
Li, Na
Fan, Lingzhong
Wei, Jianguo
Xu, Junhai
author_facet Dong, Haibin
Li, Na
Fan, Lingzhong
Wei, Jianguo
Xu, Junhai
author_sort Dong, Haibin
collection PubMed
description Emotional cues are expressed in many ways in daily life, and the emotional information we receive is often conveyed through multiple modalities. Successful social interaction requires combining multisensory cues to accurately determine the emotions of others. The integration mechanism of multimodal emotional information has been widely investigated: different brain activity measurement methods have been used to localize the brain regions involved in the audio-visual integration of emotional information, mainly in the bilateral superior temporal regions. However, the methods adopted in these studies are relatively simple, and their stimulus materials rarely contain speech. The integration mechanism of emotional speech in the human brain therefore needs further examination. In this paper, a functional magnetic resonance imaging (fMRI) study with an event-related design was conducted to explore the audio-visual integration mechanism of emotional speech in the human brain, using dynamic facial expressions and emotional speech to convey emotions of different valences. Representational similarity analysis (RSA) based on regions of interest (ROIs), whole-brain searchlight analysis, modality conjunction analysis, and supra-additive analysis were used to identify and verify the roles of the relevant brain regions. In addition, a weighted RSA method was used to evaluate the contribution of each candidate model to the best-fitted model of each ROI. The results showed that only the left insula was detected by all methods, suggesting that the left insula plays an important role in the audio-visual integration of emotional speech. Whole-brain searchlight analysis, modality conjunction analysis, and supra-additive analysis together revealed that the bilateral middle temporal gyrus (MTG), right inferior parietal lobule, and bilateral precuneus might also be involved in the audio-visual integration of emotional speech in other respects.
format Online
Article
Text
id pubmed-9695733
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9695733 2022-11-26 Integrative interaction of emotional speech in audio-visual modality Dong, Haibin Li, Na Fan, Lingzhong Wei, Jianguo Xu, Junhai Front Neurosci Neuroscience Emotional cues are expressed in many ways in daily life, and the emotional information we receive is often conveyed through multiple modalities. Successful social interaction requires combining multisensory cues to accurately determine the emotions of others. The integration mechanism of multimodal emotional information has been widely investigated: different brain activity measurement methods have been used to localize the brain regions involved in the audio-visual integration of emotional information, mainly in the bilateral superior temporal regions. However, the methods adopted in these studies are relatively simple, and their stimulus materials rarely contain speech. The integration mechanism of emotional speech in the human brain therefore needs further examination. In this paper, a functional magnetic resonance imaging (fMRI) study with an event-related design was conducted to explore the audio-visual integration mechanism of emotional speech in the human brain, using dynamic facial expressions and emotional speech to convey emotions of different valences. Representational similarity analysis (RSA) based on regions of interest (ROIs), whole-brain searchlight analysis, modality conjunction analysis, and supra-additive analysis were used to identify and verify the roles of the relevant brain regions. In addition, a weighted RSA method was used to evaluate the contribution of each candidate model to the best-fitted model of each ROI. The results showed that only the left insula was detected by all methods, suggesting that the left insula plays an important role in the audio-visual integration of emotional speech. Whole-brain searchlight analysis, modality conjunction analysis, and supra-additive analysis together revealed that the bilateral middle temporal gyrus (MTG), right inferior parietal lobule, and bilateral precuneus might also be involved in the audio-visual integration of emotional speech in other respects. Frontiers Media S.A. 2022-11-11 /pmc/articles/PMC9695733/ /pubmed/36440282 http://dx.doi.org/10.3389/fnins.2022.797277 Text en Copyright © 2022 Dong, Li, Fan, Wei and Xu. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Dong, Haibin
Li, Na
Fan, Lingzhong
Wei, Jianguo
Xu, Junhai
Integrative interaction of emotional speech in audio-visual modality
title Integrative interaction of emotional speech in audio-visual modality
title_full Integrative interaction of emotional speech in audio-visual modality
title_fullStr Integrative interaction of emotional speech in audio-visual modality
title_full_unstemmed Integrative interaction of emotional speech in audio-visual modality
title_short Integrative interaction of emotional speech in audio-visual modality
title_sort integrative interaction of emotional speech in audio-visual modality
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9695733/
https://www.ncbi.nlm.nih.gov/pubmed/36440282
http://dx.doi.org/10.3389/fnins.2022.797277
work_keys_str_mv AT donghaibin integrativeinteractionofemotionalspeechinaudiovisualmodality
AT lina integrativeinteractionofemotionalspeechinaudiovisualmodality
AT fanlingzhong integrativeinteractionofemotionalspeechinaudiovisualmodality
AT weijianguo integrativeinteractionofemotionalspeechinaudiovisualmodality
AT xujunhai integrativeinteractionofemotionalspeechinaudiovisualmodality