The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario
Main Authors: | Ahmed, Farhin; Nidiffer, Aaron R.; Lalor, Edmund C. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Cold Spring Harbor Laboratory, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10473711/ https://www.ncbi.nlm.nih.gov/pubmed/37662393 http://dx.doi.org/10.1101/2023.08.23.554451 |
_version_ | 1785100324748394496 |
---|---|
author | Ahmed, Farhin Nidiffer, Aaron R. Lalor, Edmund C. |
author_facet | Ahmed, Farhin Nidiffer, Aaron R. Lalor, Edmund C. |
author_sort | Ahmed, Farhin |
collection | PubMed |
description | Seeing the speaker's face greatly improves our speech comprehension in noisy environments. This is due to the brain's ability to combine the auditory and the visual information around us, a process known as multisensory integration. Selective attention also strongly influences what we comprehend in scenarios with multiple speakers – an effect known as the cocktail-party phenomenon. However, the interaction between attention and multisensory integration is not fully understood, especially when it comes to natural, continuous speech. In a recent electroencephalography (EEG) study, we explored this issue and showed that multisensory integration is enhanced when an audiovisual speaker is attended compared to when that speaker is unattended. Here, we extend that work to investigate how this interaction varies depending on a person’s gaze behavior, which affects the quality of the visual information they have access to. To do so, we recorded EEG from 31 healthy adults as they performed selective attention tasks in several paradigms involving two concurrently presented audiovisual speakers. We then modeled how the recorded EEG related to the audio speech (envelope) of the presented speakers. Crucially, we compared two classes of model – one that assumed underlying multisensory integration (AV) versus another that assumed two independent unisensory audio and visual processes (A+V). This comparison revealed evidence of strong attentional effects on multisensory integration when participants were looking directly at the face of an audiovisual speaker. This effect was not apparent when the speaker’s face was in the peripheral vision of the participants. Overall, our findings suggest a strong influence of attention on multisensory integration when high fidelity visual (articulatory) speech information is available. More generally, this suggests that the interplay between attention and multisensory integration during natural audiovisual speech is dynamic and is adaptable based on the specific task and environment. |
format | Online Article Text |
id | pubmed-10473711 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Cold Spring Harbor Laboratory |
record_format | MEDLINE/PubMed |
spelling | pubmed-10473711 2023-09-02 The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario Ahmed, Farhin Nidiffer, Aaron R. Lalor, Edmund C. bioRxiv Article Cold Spring Harbor Laboratory 2023-08-24 /pmc/articles/PMC10473711/ /pubmed/37662393 http://dx.doi.org/10.1101/2023.08.23.554451 Text en https://creativecommons.org/licenses/by/4.0/ This work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which allows reusers to distribute, remix, adapt, and build upon the material in any medium or format, so long as attribution is given to the creator. The license allows for commercial use. |
spellingShingle | Article Ahmed, Farhin Nidiffer, Aaron R. Lalor, Edmund C. The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario |
title | The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario |
title_full | The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario |
title_fullStr | The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario |
title_full_unstemmed | The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario |
title_short | The effect of gaze on EEG measures of multisensory integration in a cocktail party scenario |
title_sort | effect of gaze on eeg measures of multisensory integration in a cocktail party scenario |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10473711/ https://www.ncbi.nlm.nih.gov/pubmed/37662393 http://dx.doi.org/10.1101/2023.08.23.554451 |
work_keys_str_mv | AT ahmedfarhin theeffectofgazeoneegmeasuresofmultisensoryintegrationinacocktailpartyscenario AT nidifferaaronr theeffectofgazeoneegmeasuresofmultisensoryintegrationinacocktailpartyscenario AT laloredmundc theeffectofgazeoneegmeasuresofmultisensoryintegrationinacocktailpartyscenario AT ahmedfarhin effectofgazeoneegmeasuresofmultisensoryintegrationinacocktailpartyscenario AT nidifferaaronr effectofgazeoneegmeasuresofmultisensoryintegrationinacocktailpartyscenario AT laloredmundc effectofgazeoneegmeasuresofmultisensoryintegrationinacocktailpartyscenario |
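The description field above refers to comparing an integrated audiovisual encoding model (AV) against the sum of two independent unisensory models (A+V) when predicting EEG from the speech envelope. Below is a minimal sketch of how such a comparison can be set up. It is not the authors' code: the lagged ridge-regression (TRF-style) approach, the synthetic data, and all parameter values (sampling rate, lag range, regularization) are illustrative assumptions.

```python
# Minimal illustrative sketch (not the authors' code) of comparing an AV
# encoding model against an additive A+V model using lagged ridge regression.
# Synthetic data stand in for real EEG; all parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs = 64                                   # assumed sampling rate (Hz)
n_samples, n_channels = 60 * fs, 8        # 60 s of 8-channel "EEG"
lags = np.arange(0, int(0.4 * fs))        # stimulus lags spanning ~0-400 ms

def lag_matrix(stim, lags):
    """Design matrix of time-lagged copies of a 1-D stimulus feature."""
    X = np.zeros((len(stim), len(lags)))
    for j, lag in enumerate(lags):
        X[lag:, j] = stim[:len(stim) - lag] if lag else stim
    return X

def fit_ridge(X, Y, lam=1e2):
    """Ridge-regression weights mapping lagged features X to responses Y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

def mean_corr(Y, Y_hat):
    """Average Pearson correlation between actual and predicted channels."""
    return np.mean([np.corrcoef(Y[:, c], Y_hat[:, c])[0, 1]
                    for c in range(Y.shape[1])])

# Synthetic speech envelope plus a hypothetical visual (articulatory) feature
env = np.convolve(np.abs(rng.standard_normal(n_samples)),
                  np.ones(16) / 16, mode="same")
vis = np.roll(env, int(0.1 * fs)) + 0.3 * rng.standard_normal(n_samples)

X_a, X_v = lag_matrix(env, lags), lag_matrix(vis, lags)
w_true = rng.standard_normal((len(lags), n_channels))

# Simulated responses for audio-only, visual-only and audiovisual conditions
eeg_a = X_a @ w_true + rng.standard_normal((n_samples, n_channels))
eeg_v = X_v @ w_true + rng.standard_normal((n_samples, n_channels))
eeg_av = X_a @ w_true + X_v @ w_true + rng.standard_normal((n_samples, n_channels))

half = n_samples // 2                     # simple split-half train/test

# A+V: sum the predictions of models trained on the unisensory responses
w_a = fit_ridge(X_a[:half], eeg_a[:half])
w_v = fit_ridge(X_v[:half], eeg_v[:half])
pred_sum = X_a[half:] @ w_a + X_v[half:] @ w_v

# AV: a single model trained directly on the audiovisual response
X_av = np.hstack([X_a, X_v])
w_av = fit_ridge(X_av[:half], eeg_av[:half])
pred_av = X_av[half:] @ w_av

print(f"A+V model r = {mean_corr(eeg_av[half:], pred_sum):.3f}")
print(f"AV  model r = {mean_corr(eeg_av[half:], pred_av):.3f}")
# On real data, evidence of multisensory integration would appear as the AV
# model out-predicting the additive A+V model on held-out audiovisual trials.
```

The split-half validation and single regularization value keep the sketch short; a real analysis of this kind would typically use cross-validation over trials and tune the lag range and regularization per participant.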