
Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval

How we perceive and learn about our environment is influenced by our prior experiences and existing representations of the world. Top-down cognitive processes, such as attention and expectations, can alter how we process sensory stimuli, both within a modality (e.g., effects of auditory experience on auditory perception), as well as across modalities (e.g., effects of visual feedback on sound localization)…

Full description

Bibliographic Details
Main authors: Marian, Viorica, Hayakawa, Sayuri, Schroeder, Scott R.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8350348/
https://www.ncbi.nlm.nih.gov/pubmed/34381328
http://dx.doi.org/10.3389/fnins.2021.661477
_version_ 1783735741396811776
author Marian, Viorica
Hayakawa, Sayuri
Schroeder, Scott R.
author_facet Marian, Viorica
Hayakawa, Sayuri
Schroeder, Scott R.
author_sort Marian, Viorica
collection PubMed
description How we perceive and learn about our environment is influenced by our prior experiences and existing representations of the world. Top-down cognitive processes, such as attention and expectations, can alter how we process sensory stimuli, both within a modality (e.g., effects of auditory experience on auditory perception), as well as across modalities (e.g., effects of visual feedback on sound localization). Here, we demonstrate that experience with different types of auditory input (spoken words vs. environmental sounds) modulates how humans remember concurrently-presented visual objects. Participants viewed a series of line drawings (e.g., picture of a cat) displayed in one of four quadrants while listening to a word or sound that was congruent (e.g., “cat” or <meow>), incongruent (e.g., “motorcycle” or <vroom–vroom>), or neutral (e.g., a meaningless pseudoword or a tonal beep) relative to the picture. Following the encoding phase, participants were presented with the original drawings plus new drawings and asked to indicate whether each one was “old” or “new.” If a drawing was designated as “old,” participants then reported where it had been displayed. We find that words and sounds both elicit more accurate memory for what objects were previously seen, but only congruent environmental sounds enhance memory for where objects were positioned – this, despite the fact that the auditory stimuli were not meaningful spatial cues of the objects’ locations on the screen. Given that during real-world listening conditions, environmental sounds, but not words, reliably originate from the location of their referents, listening to sounds may attune the visual dorsal pathway to facilitate attention and memory for objects’ locations. We propose that audio-visual associations in the environment and in our previous experience jointly contribute to visual memory, strengthening visual memory through exposure to auditory input.
format Online
Article
Text
id pubmed-8350348
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-83503482021-08-10 Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval Marian, Viorica Hayakawa, Sayuri Schroeder, Scott R. Front Neurosci Neuroscience How we perceive and learn about our environment is influenced by our prior experiences and existing representations of the world. Top-down cognitive processes, such as attention and expectations, can alter how we process sensory stimuli, both within a modality (e.g., effects of auditory experience on auditory perception), as well as across modalities (e.g., effects of visual feedback on sound localization). Here, we demonstrate that experience with different types of auditory input (spoken words vs. environmental sounds) modulates how humans remember concurrently-presented visual objects. Participants viewed a series of line drawings (e.g., picture of a cat) displayed in one of four quadrants while listening to a word or sound that was congruent (e.g., “cat” or <meow>), incongruent (e.g., “motorcycle” or <vroom–vroom>), or neutral (e.g., a meaningless pseudoword or a tonal beep) relative to the picture. Following the encoding phase, participants were presented with the original drawings plus new drawings and asked to indicate whether each one was “old” or “new.” If a drawing was designated as “old,” participants then reported where it had been displayed. We find that words and sounds both elicit more accurate memory for what objects were previously seen, but only congruent environmental sounds enhance memory for where objects were positioned – this, despite the fact that the auditory stimuli were not meaningful spatial cues of the objects’ locations on the screen. Given that during real-world listening conditions, environmental sounds, but not words, reliably originate from the location of their referents, listening to sounds may attune the visual dorsal pathway to facilitate attention and memory for objects’ locations. We propose that audio-visual associations in the environment and in our previous experience jointly contribute to visual memory, strengthening visual memory through exposure to auditory input. Frontiers Media S.A. 2021-07-26 /pmc/articles/PMC8350348/ /pubmed/34381328 http://dx.doi.org/10.3389/fnins.2021.661477 Text en Copyright © 2021 Marian, Hayakawa and Schroeder. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Marian, Viorica
Hayakawa, Sayuri
Schroeder, Scott R.
Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval
title Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval
title_full Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval
title_fullStr Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval
title_full_unstemmed Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval
title_short Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval
title_sort cross-modal interaction between auditory and visual input impacts memory retrieval
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8350348/
https://www.ncbi.nlm.nih.gov/pubmed/34381328
http://dx.doi.org/10.3389/fnins.2021.661477
work_keys_str_mv AT marianviorica crossmodalinteractionbetweenauditoryandvisualinputimpactsmemoryretrieval
AT hayakawasayuri crossmodalinteractionbetweenauditoryandvisualinputimpactsmemoryretrieval
AT schroederscottr crossmodalinteractionbetweenauditoryandvisualinputimpactsmemoryretrieval