
Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians

Bibliographic Details
Main Authors: Sorati, Marzieh, Behne, Dawn M.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7854916/
https://www.ncbi.nlm.nih.gov/pubmed/33551911
http://dx.doi.org/10.3389/fpsyg.2020.594434
_version_ 1783646161059446784
author Sorati, Marzieh
Behne, Dawn M.
author_facet Sorati, Marzieh
Behne, Dawn M.
author_sort Sorati, Marzieh
collection PubMed
description Previous research with speech and non-speech stimuli suggests that in audiovisual perception, visual information starting prior to the onset of the corresponding sound provides cues that form a prediction about the upcoming auditory event. This prediction leads to audiovisual (AV) interaction: auditory and visual perception interact, inducing suppression and latency reduction of early auditory event-related potentials (ERPs) such as N1 and P2. To investigate AV interaction, previous research examined N1 and P2 amplitudes and latencies in response to audio only (AO), video only (VO), audiovisual, and control (CO) stimuli, and compared AV with auditory perception based on four AV interaction models (AV vs. AO+VO, AV-VO vs. AO, AV-VO vs. AO-CO, AV vs. AO). The current study addresses how different models of AV interaction express N1 and P2 suppression in music perception. Furthermore, it takes a step further and examines whether previous musical experience, which can potentially lead to higher N1 and P2 amplitudes in auditory perception, influences AV interaction in the different models. Musicians and non-musicians were presented with recordings (AO, AV, VO) of a keyboard /C4/ key being played, as well as CO stimuli. Results showed that the AV interaction models differ in their expression of N1 and P2 amplitude and latency suppression. The calculations underlying the (AV-VO vs. AO) and (AV-VO vs. AO-CO) models have consequences for the resulting N1 and P2 difference waves. Furthermore, while musicians, compared to non-musicians, showed higher N1 amplitude in auditory perception, suppression of amplitudes and latencies for N1 and P2 was similar for the two groups across the AV models. Collectively, these results suggest that when visual cues from finger and hand movements predict the upcoming sound in AV music perception, suppression of early ERPs is similar for musicians and non-musicians. Notably, the calculation differences across models do not lead to the same pattern of results for N1 and P2, demonstrating that the four models are not interchangeable and are not directly comparable.
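The four AV interaction models named in the abstract reduce to difference-wave arithmetic on the per-condition ERP averages. A minimal sketch of those comparisons, using synthetic, purely illustrative waveforms (none of the values below come from the study):

```python
import numpy as np

# Hypothetical grand-average ERPs (microvolts) at one electrode;
# shapes and amplitudes are made up for illustration only.
t = np.linspace(-0.1, 0.5, 601)               # seconds relative to sound onset
AO = -4 * np.exp(-((t - 0.1) / 0.03) ** 2)    # audio-only: N1-like deflection
VO = 0.3 * np.sin(2 * np.pi * 2 * t)          # video-only: slow visual activity
AV = 0.8 * AO + VO                            # audiovisual: suppressed N1
CO = 0.1 * np.sin(2 * np.pi * 2 * t)          # control condition

# The four AV interaction models, expressed as difference waves:
m1 = AV - (AO + VO)          # model 1: AV vs. AO+VO (additive model)
m2 = (AV - VO) - AO          # model 2: AV-VO vs. AO
m3 = (AV - VO) - (AO - CO)   # model 3: AV-VO vs. AO-CO
m4 = AV - AO                 # model 4: AV vs. AO

# By construction, models 2 and 3 differ exactly by the control wave,
# one way of seeing why the models are not interchangeable.
assert np.allclose(m3 - m2, CO)
```

This also makes the abstract's point concrete: whether CO (and VO) is subtracted changes the difference wave on which N1 and P2 are measured, so the four models need not yield the same pattern of results.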
format Online
Article
Text
id pubmed-7854916
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-78549162021-02-04 Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians Sorati, Marzieh Behne, Dawn M. Front Psychol Psychology Frontiers Media S.A. 2021-01-20 /pmc/articles/PMC7854916/ /pubmed/33551911 http://dx.doi.org/10.3389/fpsyg.2020.594434 Text en Copyright © 2021 Sorati and Behne. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY).
spellingShingle Psychology
Sorati, Marzieh
Behne, Dawn M.
Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians
title Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians
title_full Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians
title_fullStr Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians
title_full_unstemmed Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians
title_short Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians
title_sort considerations in audio-visual interaction models: an erp study of music perception by musicians and non-musicians
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7854916/
https://www.ncbi.nlm.nih.gov/pubmed/33551911
http://dx.doi.org/10.3389/fpsyg.2020.594434
work_keys_str_mv AT soratimarzieh considerationsinaudiovisualinteractionmodelsanerpstudyofmusicperceptionbymusiciansandnonmusicians
AT behnedawnm considerationsinaudiovisualinteractionmodelsanerpstudyofmusicperceptionbymusiciansandnonmusicians