
Compensating for age limits through emotional crossmodal integration

Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and this also occurs with vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less documented. Here, we investigated the age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults performed slower and worse than younger adults at recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger except for anger. Importantly, additional analyses using the “race model” demonstrate that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults.
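The abstract's reference to a "race model" analysis usually denotes Miller's race model inequality applied to reaction-time distributions across unimodal and crossmodal conditions. The following is a minimal sketch of that kind of test under that assumption; the function names and simulated reaction times are hypothetical and are not the authors' data or code.

```python
# Hedged sketch of a race model inequality test (Miller, 1982), assuming
# reaction times from visual-only, auditory-only, and crossmodal trials.
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative distribution of reaction times at each time point."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

def race_model_violation(rt_face, rt_voice, rt_crossmodal, n_points=50):
    """Return time points where the crossmodal CDF exceeds the race model bound.

    The race model predicts P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t);
    exceeding that bound is commonly taken as evidence of genuine
    crossmodal integration rather than statistical facilitation.
    """
    all_rts = np.concatenate([rt_face, rt_voice, rt_crossmodal])
    t_grid = np.linspace(all_rts.min(), all_rts.max(), n_points)
    bound = np.minimum(ecdf(rt_face, t_grid) + ecdf(rt_voice, t_grid), 1.0)
    cdf_av = ecdf(rt_crossmodal, t_grid)
    return t_grid[cdf_av > bound]

# Illustrative use with made-up reaction times (milliseconds):
rng = np.random.default_rng(0)
rt_face = rng.normal(900, 120, 200)    # visual-only trials
rt_voice = rng.normal(950, 130, 200)   # auditory-only trials
rt_cross = rng.normal(780, 100, 200)   # congruent face + voice trials
print(race_model_violation(rt_face, rt_voice, rt_cross))
```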


Bibliographic Details
Main Authors: Chaby, Laurence, Boullay, Viviane Luherne-du, Chetouani, Mohamed, Plaza, Monique
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2015
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4445247/
https://www.ncbi.nlm.nih.gov/pubmed/26074845
http://dx.doi.org/10.3389/fpsyg.2015.00691
_version_ 1782373257827057664
author Chaby, Laurence
Boullay, Viviane Luherne-du
Chetouani, Mohamed
Plaza, Monique
author_facet Chaby, Laurence
Boullay, Viviane Luherne-du
Chetouani, Mohamed
Plaza, Monique
author_sort Chaby, Laurence
collection PubMed
description Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and this also occurs with vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less documented. Here, we investigated the age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults performed slower and worse than younger adults at recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger except for anger. Importantly, additional analyses using the “race model” demonstrate that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults.
format Online
Article
Text
id pubmed-4445247
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-44452472015-06-12 Compensating for age limits through emotional crossmodal integration Chaby, Laurence Boullay, Viviane Luherne-du Chetouani, Mohamed Plaza, Monique Front Psychol Psychology Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and this also occurs with vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less documented. Here, we investigated the age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults performed slower and worse than younger adults at recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger except for anger. Importantly, additional analyses using the “race model” demonstrate that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults. Frontiers Media S.A. 2015-05-27 /pmc/articles/PMC4445247/ /pubmed/26074845 http://dx.doi.org/10.3389/fpsyg.2015.00691 Text en Copyright © 2015 Chaby, Luherne-du Boullay, Chetouani and Plaza. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Psychology
Chaby, Laurence
Boullay, Viviane Luherne-du
Chetouani, Mohamed
Plaza, Monique
Compensating for age limits through emotional crossmodal integration
title Compensating for age limits through emotional crossmodal integration
title_full Compensating for age limits through emotional crossmodal integration
title_fullStr Compensating for age limits through emotional crossmodal integration
title_full_unstemmed Compensating for age limits through emotional crossmodal integration
title_short Compensating for age limits through emotional crossmodal integration
title_sort compensating for age limits through emotional crossmodal integration
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4445247/
https://www.ncbi.nlm.nih.gov/pubmed/26074845
http://dx.doi.org/10.3389/fpsyg.2015.00691
work_keys_str_mv AT chabylaurence compensatingforagelimitsthroughemotionalcrossmodalintegration
AT boullayvivianeluhernedu compensatingforagelimitsthroughemotionalcrossmodalintegration
AT chetouanimohamed compensatingforagelimitsthroughemotionalcrossmodalintegration
AT plazamonique compensatingforagelimitsthroughemotionalcrossmodalintegration