The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs
Main Authors: Layer, Natalie; Weglage, Anna; Müller, Verena; Meister, Hartmut; Lang-Roth, Ruth; Walger, Martin; Murray, Micah M.; Sandmann, Pascale
Format: Online Article Text
Language: English
Published: Elsevier, 2022
Subjects: Regular Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8927996/ https://www.ncbi.nlm.nih.gov/pubmed/35303598 http://dx.doi.org/10.1016/j.nicl.2022.102982
_version_ | 1784670565448024064 |
author | Layer, Natalie Weglage, Anna Müller, Verena Meister, Hartmut Lang-Roth, Ruth Walger, Martin Murray, Micah M. Sandmann, Pascale |
author_facet | Layer, Natalie Weglage, Anna Müller, Verena Meister, Hartmut Lang-Roth, Ruth Walger, Martin Murray, Micah M. Sandmann, Pascale |
author_sort | Layer, Natalie |
collection | PubMed |
description | A cochlear implant (CI) is an auditory prosthesis which can partially restore the auditory function in patients with severe to profound hearing loss. However, this bionic device provides only limited auditory information, and CI patients may compensate for this limitation by means of a stronger interaction between the auditory and visual system. To better understand the electrophysiological correlates of audiovisual speech perception, the present study used electroencephalography (EEG) and a redundant target paradigm. Postlingually deafened CI users and normal-hearing (NH) listeners were compared in auditory, visual and audiovisual speech conditions. The behavioural results revealed multisensory integration for both groups, as indicated by shortened response times for the audiovisual as compared to the two unisensory conditions. The analysis of the N1 and P2 event-related potentials (ERPs), including topographic and source analyses, confirmed a multisensory effect for both groups and showed a cortical auditory response which was modulated by the simultaneous processing of the visual stimulus. Nevertheless, the CI users in particular revealed a distinct pattern of N1 topography, pointing to a strong visual impact on auditory speech processing. Apart from these condition effects, the results revealed ERP differences between CI users and NH listeners, not only in N1/P2 ERP topographies, but also in the cortical source configuration. When compared to the NH listeners, the CI users showed an additional activation in the visual cortex at N1 latency, which was positively correlated with CI experience, and a delayed auditory-cortex activation with a reversed, rightward functional lateralisation. In sum, our behavioural and ERP findings demonstrate a clear audiovisual benefit for both groups, and a CI-specific alteration in cortical activation at N1 latency when auditory and visual input is combined. 
These cortical alterations may reflect a compensatory strategy to overcome the limited CI input, allowing the CI users to improve their lip-reading skills and to approximate the behavioural performance of NH listeners in audiovisual speech conditions. Our results are clinically relevant, as they highlight the importance of assessing the CI outcome not only in auditory-only, but also in audiovisual speech conditions. |
format | Online Article Text |
id | pubmed-8927996 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Elsevier |
record_format | MEDLINE/PubMed |
spelling | pubmed-89279962022-03-18 The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs Layer, Natalie Weglage, Anna Müller, Verena Meister, Hartmut Lang-Roth, Ruth Walger, Martin Murray, Micah M. Sandmann, Pascale Neuroimage Clin Regular Article A cochlear implant (CI) is an auditory prosthesis which can partially restore the auditory function in patients with severe to profound hearing loss. However, this bionic device provides only limited auditory information, and CI patients may compensate for this limitation by means of a stronger interaction between the auditory and visual system. To better understand the electrophysiological correlates of audiovisual speech perception, the present study used electroencephalography (EEG) and a redundant target paradigm. Postlingually deafened CI users and normal-hearing (NH) listeners were compared in auditory, visual and audiovisual speech conditions. The behavioural results revealed multisensory integration for both groups, as indicated by shortened response times for the audiovisual as compared to the two unisensory conditions. The analysis of the N1 and P2 event-related potentials (ERPs), including topographic and source analyses, confirmed a multisensory effect for both groups and showed a cortical auditory response which was modulated by the simultaneous processing of the visual stimulus. Nevertheless, the CI users in particular revealed a distinct pattern of N1 topography, pointing to a strong visual impact on auditory speech processing. Apart from these condition effects, the results revealed ERP differences between CI users and NH listeners, not only in N1/P2 ERP topographies, but also in the cortical source configuration. 
When compared to the NH listeners, the CI users showed an additional activation in the visual cortex at N1 latency, which was positively correlated with CI experience, and a delayed auditory-cortex activation with a reversed, rightward functional lateralisation. In sum, our behavioural and ERP findings demonstrate a clear audiovisual benefit for both groups, and a CI-specific alteration in cortical activation at N1 latency when auditory and visual input is combined. These cortical alterations may reflect a compensatory strategy to overcome the limited CI input, which allows the CI users to improve the lip-reading skills and to approximate the behavioural performance of NH listeners in audiovisual speech conditions. Our results are clinically relevant, as they highlight the importance of assessing the CI outcome not only in auditory-only, but also in audiovisual speech conditions. Elsevier 2022-03-04 /pmc/articles/PMC8927996/ /pubmed/35303598 http://dx.doi.org/10.1016/j.nicl.2022.102982 Text en © 2022 The Authors https://creativecommons.org/licenses/by-nc-nd/4.0/This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). |
spellingShingle | Regular Article Layer, Natalie Weglage, Anna Müller, Verena Meister, Hartmut Lang-Roth, Ruth Walger, Martin Murray, Micah M. Sandmann, Pascale The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs |
title | The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs |
title_full | The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs |
title_fullStr | The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs |
title_full_unstemmed | The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs |
title_short | The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs |
title_sort | timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by erps |
topic | Regular Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8927996/ https://www.ncbi.nlm.nih.gov/pubmed/35303598 http://dx.doi.org/10.1016/j.nicl.2022.102982 |
work_keys_str_mv | AT layernatalie thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT weglageanna thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT mullerverena thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT meisterhartmut thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT langrothruth thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT walgermartin thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT murraymicahm thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT sandmannpascale thetimecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT layernatalie timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT weglageanna timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT mullerverena timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT meisterhartmut timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT langrothruth timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT walgermartin timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT murraymicahm timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps AT sandmannpascale timecourseofmultisensoryspeechprocessinginunilaterallystimulatedcochlearimplantusersrevealedbyerps |