
Examining the Role of Eye Movements During Conversational Listening in Noise

Speech comprehension is often thought of as an entirely auditory process, but both normal hearing and hearing-impaired individuals sometimes use visual attention to disambiguate speech, particularly when it is difficult to hear. Many studies have investigated how visual attention (or the lack thereof) impacts the perception of simple speech sounds such as isolated consonants, but there is a gap in the literature concerning visual attention during natural speech comprehension. This issue needs to be addressed, as individuals process sounds and words in everyday speech differently than when they are separated into individual elements with no competing sound sources or noise. Moreover, further research is needed to explore patterns of eye movements during speech comprehension – especially in the presence of noise – as such an investigation would allow us to better understand how people strategically use visual information while processing speech. To this end, we conducted an experiment to track eye-gaze behavior during a series of listening tasks as a function of the number of speakers, background noise intensity, and the presence or absence of simulated hearing impairment. Our specific aims were to discover how individuals might adapt their oculomotor behavior to compensate for the difficulty of the listening scenario, such as when listening in noisy environments or experiencing simulated hearing loss. Speech comprehension difficulty was manipulated by simulating hearing loss and varying background noise intensity. Results showed that eye movements were affected by the number of speakers, simulated hearing impairment, and the presence of noise. Further, findings showed that differing levels of signal-to-noise ratio (SNR) led to changes in eye-gaze behavior. Most notably, we found that the addition of visual information (i.e. videos vs. auditory information only) led to enhanced speech comprehension – highlighting the strategic usage of visual information during this process.


Bibliographic Details
Main Authors: Šabić, Edin, Henning, Daniel, Myüz, Hunter, Morrow, Audrey, Hout, Michael C., MacDonald, Justin A.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7033431/
https://www.ncbi.nlm.nih.gov/pubmed/32116975
http://dx.doi.org/10.3389/fpsyg.2020.00200
_version_ 1783499665542479872
author Šabić, Edin
Henning, Daniel
Myüz, Hunter
Morrow, Audrey
Hout, Michael C.
MacDonald, Justin A.
author_facet Šabić, Edin
Henning, Daniel
Myüz, Hunter
Morrow, Audrey
Hout, Michael C.
MacDonald, Justin A.
author_sort Šabić, Edin
collection PubMed
description Speech comprehension is often thought of as an entirely auditory process, but both normal hearing and hearing-impaired individuals sometimes use visual attention to disambiguate speech, particularly when it is difficult to hear. Many studies have investigated how visual attention (or the lack thereof) impacts the perception of simple speech sounds such as isolated consonants, but there is a gap in the literature concerning visual attention during natural speech comprehension. This issue needs to be addressed, as individuals process sounds and words in everyday speech differently than when they are separated into individual elements with no competing sound sources or noise. Moreover, further research is needed to explore patterns of eye movements during speech comprehension – especially in the presence of noise – as such an investigation would allow us to better understand how people strategically use visual information while processing speech. To this end, we conducted an experiment to track eye-gaze behavior during a series of listening tasks as a function of the number of speakers, background noise intensity, and the presence or absence of simulated hearing impairment. Our specific aims were to discover how individuals might adapt their oculomotor behavior to compensate for the difficulty of the listening scenario, such as when listening in noisy environments or experiencing simulated hearing loss. Speech comprehension difficulty was manipulated by simulating hearing loss and varying background noise intensity. Results showed that eye movements were affected by the number of speakers, simulated hearing impairment, and the presence of noise. Further, findings showed that differing levels of signal-to-noise ratio (SNR) led to changes in eye-gaze behavior. Most notably, we found that the addition of visual information (i.e. videos vs. auditory information only) led to enhanced speech comprehension – highlighting the strategic usage of visual information during this process.
format Online
Article
Text
id pubmed-7033431
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-70334312020-02-28 Examining the Role of Eye Movements During Conversational Listening in Noise Šabić, Edin Henning, Daniel Myüz, Hunter Morrow, Audrey Hout, Michael C. MacDonald, Justin A. Front Psychol Psychology Speech comprehension is often thought of as an entirely auditory process, but both normal hearing and hearing-impaired individuals sometimes use visual attention to disambiguate speech, particularly when it is difficult to hear. Many studies have investigated how visual attention (or the lack thereof) impacts the perception of simple speech sounds such as isolated consonants, but there is a gap in the literature concerning visual attention during natural speech comprehension. This issue needs to be addressed, as individuals process sounds and words in everyday speech differently than when they are separated into individual elements with no competing sound sources or noise. Moreover, further research is needed to explore patterns of eye movements during speech comprehension – especially in the presence of noise – as such an investigation would allow us to better understand how people strategically use visual information while processing speech. To this end, we conducted an experiment to track eye-gaze behavior during a series of listening tasks as a function of the number of speakers, background noise intensity, and the presence or absence of simulated hearing impairment. Our specific aims were to discover how individuals might adapt their oculomotor behavior to compensate for the difficulty of the listening scenario, such as when listening in noisy environments or experiencing simulated hearing loss. Speech comprehension difficulty was manipulated by simulating hearing loss and varying background noise intensity. Results showed that eye movements were affected by the number of speakers, simulated hearing impairment, and the presence of noise. Further, findings showed that differing levels of signal-to-noise ratio (SNR) led to changes in eye-gaze behavior. Most notably, we found that the addition of visual information (i.e. videos vs. auditory information only) led to enhanced speech comprehension – highlighting the strategic usage of visual information during this process. Frontiers Media S.A. 2020-02-14 /pmc/articles/PMC7033431/ /pubmed/32116975 http://dx.doi.org/10.3389/fpsyg.2020.00200 Text en Copyright © 2020 Šabić, Henning, Myüz, Morrow, Hout and MacDonald. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Psychology
Šabić, Edin
Henning, Daniel
Myüz, Hunter
Morrow, Audrey
Hout, Michael C.
MacDonald, Justin A.
Examining the Role of Eye Movements During Conversational Listening in Noise
title Examining the Role of Eye Movements During Conversational Listening in Noise
title_full Examining the Role of Eye Movements During Conversational Listening in Noise
title_fullStr Examining the Role of Eye Movements During Conversational Listening in Noise
title_full_unstemmed Examining the Role of Eye Movements During Conversational Listening in Noise
title_short Examining the Role of Eye Movements During Conversational Listening in Noise
title_sort examining the role of eye movements during conversational listening in noise
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7033431/
https://www.ncbi.nlm.nih.gov/pubmed/32116975
http://dx.doi.org/10.3389/fpsyg.2020.00200
work_keys_str_mv AT sabicedin examiningtheroleofeyemovementsduringconversationallisteninginnoise
AT henningdaniel examiningtheroleofeyemovementsduringconversationallisteninginnoise
AT myuzhunter examiningtheroleofeyemovementsduringconversationallisteninginnoise
AT morrowaudrey examiningtheroleofeyemovementsduringconversationallisteninginnoise
AT houtmichaelc examiningtheroleofeyemovementsduringconversationallisteninginnoise
AT macdonaldjustina examiningtheroleofeyemovementsduringconversationallisteninginnoise