Manual Gestures Modulate Early Neural Responses in Loudness Perception
How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies of audiovisual interaction have focused on abstract levels such as categorical representation (e.g., the McGurk effect). It is unclear whether cross-modal modulation can extend to low-level perceptual attributes.
Main authors: | Sun, Jiaqiu; Wang, Ziqing; Tian, Xing |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2021 |
Subjects: | Neuroscience |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8440995/ https://www.ncbi.nlm.nih.gov/pubmed/34539324 http://dx.doi.org/10.3389/fnins.2021.634967 |
_version_ | 1783752785386274816 |
---|---|
author | Sun, Jiaqiu; Wang, Ziqing; Tian, Xing |
author_facet | Sun, Jiaqiu; Wang, Ziqing; Tian, Xing |
author_sort | Sun, Jiaqiu |
collection | PubMed |
description | How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies of audiovisual interaction have focused on abstract levels such as categorical representation (e.g., the McGurk effect). It is unclear whether cross-modal modulation can extend to low-level perceptual attributes. This study used motional manual gestures to test whether and how loudness perception can be modulated by visual-motion information. Specifically, we implemented a novel paradigm in which participants compared the loudness of two consecutive sounds whose intensity differed by an amount around the just noticeable difference (JND), with manual gestures presented concurrently with the second sound. In two behavioral experiments and two EEG experiments, we investigated our hypothesis that the visual-motor information in gestures would modulate loudness perception. Behavioral results showed that the gestural information biased the judgment of loudness. More importantly, the EEG results demonstrated that early auditory responses around 100 ms after sound onset (N100) were modulated by the gestures. These consistent results across four behavioral and EEG experiments suggest that visual-motor processing can integrate with auditory processing at an early perceptual stage to shape the perception of a low-level perceptual attribute such as loudness, at least under challenging listening conditions. |
format | Online Article Text |
id | pubmed-8440995 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-8440995 2021-09-16 Manual Gestures Modulate Early Neural Responses in Loudness Perception Sun, Jiaqiu Wang, Ziqing Tian, Xing Front Neurosci Neuroscience How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies of audiovisual interaction have focused on abstract levels such as categorical representation (e.g., the McGurk effect). It is unclear whether cross-modal modulation can extend to low-level perceptual attributes. This study used motional manual gestures to test whether and how loudness perception can be modulated by visual-motion information. Specifically, we implemented a novel paradigm in which participants compared the loudness of two consecutive sounds whose intensity differed by an amount around the just noticeable difference (JND), with manual gestures presented concurrently with the second sound. In two behavioral experiments and two EEG experiments, we investigated our hypothesis that the visual-motor information in gestures would modulate loudness perception. Behavioral results showed that the gestural information biased the judgment of loudness. More importantly, the EEG results demonstrated that early auditory responses around 100 ms after sound onset (N100) were modulated by the gestures. These consistent results across four behavioral and EEG experiments suggest that visual-motor processing can integrate with auditory processing at an early perceptual stage to shape the perception of a low-level perceptual attribute such as loudness, at least under challenging listening conditions. Frontiers Media S.A. 2021-09-01 /pmc/articles/PMC8440995/ /pubmed/34539324 http://dx.doi.org/10.3389/fnins.2021.634967 Text en Copyright © 2021 Sun, Wang and Tian. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience; Sun, Jiaqiu; Wang, Ziqing; Tian, Xing; Manual Gestures Modulate Early Neural Responses in Loudness Perception |
title | Manual Gestures Modulate Early Neural Responses in Loudness Perception |
title_full | Manual Gestures Modulate Early Neural Responses in Loudness Perception |
title_fullStr | Manual Gestures Modulate Early Neural Responses in Loudness Perception |
title_full_unstemmed | Manual Gestures Modulate Early Neural Responses in Loudness Perception |
title_short | Manual Gestures Modulate Early Neural Responses in Loudness Perception |
title_sort | manual gestures modulate early neural responses in loudness perception |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8440995/ https://www.ncbi.nlm.nih.gov/pubmed/34539324 http://dx.doi.org/10.3389/fnins.2021.634967 |
work_keys_str_mv | AT sunjiaqiu manualgesturesmodulateearlyneuralresponsesinloudnessperception AT wangziqing manualgesturesmodulateearlyneuralresponsesinloudnessperception AT tianxing manualgesturesmodulateearlyneuralresponsesinloudnessperception |