Hearing, seeing, and feeling speech: the neurophysiological correlates of trimodal speech perception

Bibliographic Details
Main Authors: Hansmann, Doreen; Derrick, Donald; Theys, Catherine
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10495990/
https://www.ncbi.nlm.nih.gov/pubmed/37706173
http://dx.doi.org/10.3389/fnhum.2023.1225976
collection PubMed
description
INTRODUCTION: To perceive speech, our brains process information from different sensory modalities. Previous electroencephalography (EEG) research has established that audio-visual information provides an advantage compared to auditory-only information during early auditory processing. In addition, behavioral research has shown that auditory speech perception is enhanced not only by visual information but also by tactile information, transmitted by puffs of air arriving at the skin in alignment with speech. The current EEG study investigated whether the behavioral benefits of bimodal audio-aerotactile and trimodal audio-visual-aerotactile speech presentation are reflected in cortical auditory event-related neurophysiological responses.
METHODS: To examine the influence of multimodal information on speech perception, 20 listeners completed a two-alternative forced-choice syllable identification task at three different signal-to-noise levels.
RESULTS: Behavioral results showed increased syllable identification accuracy when auditory information was complemented with visual information, but not when tactile information was added. Similarly, EEG results showed an amplitude suppression of the auditory N1 and P2 event-related potentials for the audio-visual and audio-visual-aerotactile modalities compared to the auditory and audio-aerotactile presentations of the syllable /pa/. No statistically significant difference was found between the audio-aerotactile and auditory-only modalities.
DISCUSSION: The current findings are consistent with past EEG research showing a visually induced amplitude suppression during early auditory processing. In addition, the significant neurophysiological effect of audio-visual but not audio-aerotactile presentation is in line with the large benefit of visual information, but comparatively much smaller effect of aerotactile information, on auditory speech perception previously identified in behavioral research.
id pubmed-10495990
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling Front Hum Neurosci (Neuroscience). Frontiers Media S.A., published online 2023-08-29. Copyright © 2023 Hansmann, Derrick and Theys. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
topic Neuroscience