
Gaze-Direction-Based MEG Averaging During Audiovisual Speech Perception


Bibliographic Details
Main Authors: Hirvenkari, Lotta, Jousmäki, Veikko, Lamminmäki, Satu, Saarinen, Veli-Matti, Sams, Mikko E., Hari, Riitta
Format: Text
Language: English
Published: Frontiers Research Foundation 2010
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2839848/
https://www.ncbi.nlm.nih.gov/pubmed/20300464
http://dx.doi.org/10.3389/fnhum.2010.00017
author Hirvenkari, Lotta
Jousmäki, Veikko
Lamminmäki, Satu
Saarinen, Veli-Matti
Sams, Mikko E.
Hari, Riitta
collection PubMed
description To take a step towards real-life-like experimental setups, we simultaneously recorded magnetoencephalographic (MEG) signals and subject's gaze direction during audiovisual speech perception. The stimuli were utterances of /apa/ dubbed onto two side-by-side female faces articulating /apa/ (congruent) and /aka/ (incongruent) in synchrony, repeated once every 3 s. Subjects (N = 10) were free to decide which face they viewed, and responses were averaged to two categories according to the gaze direction. The right-hemisphere 100-ms response to the onset of the second vowel (N100m’) was a fifth smaller to incongruent than congruent stimuli. The results demonstrate the feasibility of realistic viewing conditions with gaze-based averaging of MEG signals.
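The abstract describes averaging MEG responses into two categories according to where the subject was looking. As a rough illustration only, the following minimal NumPy sketch sorts stimulus-locked epochs into two averages by the gaze position at stimulus onset; all variable names, window lengths, and the left/right screen split are assumptions made here for illustration, not the authors' actual analysis pipeline.

# Minimal sketch of gaze-direction-based averaging of MEG epochs.
# Hypothetical inputs and parameters; not the authors' analysis code.
import numpy as np

def gaze_based_averages(meg, gaze_x, stim_onsets, screen_mid_x,
                        pre=100, post=500):
    """Average MEG epochs into two categories by which face was fixated.

    meg         : (n_channels, n_samples) continuous MEG recording
    gaze_x      : (n_samples,) horizontal gaze position on the same time grid
    stim_onsets : sample indices of the stimulus onsets (one per ~3-s trial)
    screen_mid_x: horizontal coordinate separating the two faces on screen
    """
    left_epochs, right_epochs = [], []
    for onset in stim_onsets:
        if onset - pre < 0 or onset + post > meg.shape[1]:
            continue  # skip trials without a full epoch window
        epoch = meg[:, onset - pre:onset + post]
        # baseline-correct using the pre-stimulus interval
        epoch = epoch - epoch[:, :pre].mean(axis=1, keepdims=True)
        # assign the trial to a category by gaze position at stimulus onset
        if gaze_x[onset] < screen_mid_x:
            left_epochs.append(epoch)
        else:
            right_epochs.append(epoch)
    # average within each gaze category (None if a category received no trials)
    avg_left = np.mean(left_epochs, axis=0) if left_epochs else None
    avg_right = np.mean(right_epochs, axis=0) if right_epochs else None
    return avg_left, avg_right

Which of the two gaze categories corresponds to the congruent versus the incongruent face would then depend on which face articulated /apa/ and which /aka/ in a given block.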
format Text
id pubmed-2839848
institution National Center for Biotechnology Information
language English
publishDate 2010
publisher Frontiers Research Foundation
record_format MEDLINE/PubMed
spelling pubmed-2839848 2010-03-17
Front Hum Neurosci, Neuroscience
Frontiers Research Foundation 2010-03-08
/pmc/articles/PMC2839848/ /pubmed/20300464 http://dx.doi.org/10.3389/fnhum.2010.00017
Copyright © 2010 Hirvenkari, Jousmäki, Lamminmäki, Saarinen, Sams and Hari. http://www.frontiersin.org/licenseagreement This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.
title Gaze-Direction-Based MEG Averaging During Audiovisual Speech Perception
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2839848/
https://www.ncbi.nlm.nih.gov/pubmed/20300464
http://dx.doi.org/10.3389/fnhum.2010.00017