
The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception

Recent magneto-encephalographic and electro-encephalographic studies provide evidence for cross-modal integration during audio-visual and audio-haptic speech perception, with speech gestures viewed or felt through manual tactile contact with the speaker's face. Given the temporal precedence of the haptic and visual signals over the acoustic signal in these studies, the observed modulation of N1/P2 auditory evoked responses during bimodal compared to unimodal speech perception suggests that relevant and predictive visual and haptic cues may facilitate auditory speech processing. To further investigate this hypothesis, auditory evoked potentials were compared during auditory-only, audio-visual, and audio-haptic speech perception in live dyadic interactions between a listener and a speaker. In line with previous studies, auditory evoked potentials were attenuated and speeded up during both audio-haptic and audio-visual compared to auditory-only speech perception. Importantly, the observed latency and amplitude reductions did not significantly depend on the degree of visual and haptic recognition of the speech targets. Altogether, these results further demonstrate cross-modal interactions between the auditory, visual, and haptic speech signals. Although they do not contradict the hypothesis that visual and haptic sensory inputs convey predictive information about the incoming auditory speech input, these results suggest that, at least in live conversational interactions, general conclusions about sensory predictability in bimodal speech integration should be drawn with caution, since the extraction of predictive cues likely depends on the variability of the speech stimuli.

Bibliographic Details
Main Authors: Treille, Avril; Vilain, Coriandre; Sato, Marc
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2014
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4026678/
https://www.ncbi.nlm.nih.gov/pubmed/24860533
http://dx.doi.org/10.3389/fpsyg.2014.00420
Journal: Front Psychol (Frontiers Media S.A.), published online 2014-05-13
Copyright © 2014 Treille, Vilain and Sato. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY): http://creativecommons.org/licenses/by/3.0/. Use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.