
Intermodal event files: integrating features across vision, audition, taction, and action

Understanding how the human brain integrates features of perceived events calls for the examination of binding processes within and across different modalities and domains. Recent studies of feature-repetition effects have demonstrated interactions between shape, color, and location in the visual modality and between pitch, loudness, and location in the auditory modality: repeating one feature is beneficial if other features are also repeated, but detrimental if not. These partial-repetition costs suggest that co-occurring features are spontaneously bound into temporary event files. Here, we investigated whether these observations can be extended to features from different sensory modalities, combining visual and auditory features in Experiment 1 and auditory and tactile features in Experiment 2. The same types of interactions, as for unimodal feature combinations, were obtained including interactions between stimulus and response features. However, the size of the interactions varied with the particular combination of features, suggesting that the salience of features and the temporal overlap between feature-code activations plays a mediating role.


Bibliographic Details
Main Authors: Zmigrod, Sharon, Spapé, Michiel, Hommel, Bernhard
Format: Text
Language: English
Published: Springer-Verlag 2008
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2708333/
https://www.ncbi.nlm.nih.gov/pubmed/18836741
http://dx.doi.org/10.1007/s00426-008-0163-5
author Zmigrod, Sharon
Spapé, Michiel
Hommel, Bernhard
author_facet Zmigrod, Sharon
Spapé, Michiel
Hommel, Bernhard
author_sort Zmigrod, Sharon
collection PubMed
description Understanding how the human brain integrates features of perceived events calls for the examination of binding processes within and across different modalities and domains. Recent studies of feature-repetition effects have demonstrated interactions between shape, color, and location in the visual modality and between pitch, loudness, and location in the auditory modality: repeating one feature is beneficial if other features are also repeated, but detrimental if not. These partial-repetition costs suggest that co-occurring features are spontaneously bound into temporary event files. Here, we investigated whether these observations can be extended to features from different sensory modalities, combining visual and auditory features in Experiment 1 and auditory and tactile features in Experiment 2. The same types of interactions, as for unimodal feature combinations, were obtained including interactions between stimulus and response features. However, the size of the interactions varied with the particular combination of features, suggesting that the salience of features and the temporal overlap between feature-code activations plays a mediating role.
format Text
id pubmed-2708333
institution National Center for Biotechnology Information
language English
publishDate 2008
publisher Springer-Verlag
record_format MEDLINE/PubMed
spelling pubmed-27083332009-07-10 Intermodal event files: integrating features across vision, audition, taction, and action Zmigrod, Sharon Spapé, Michiel Hommel, Bernhard Psychol Res Original Article Understanding how the human brain integrates features of perceived events calls for the examination of binding processes within and across different modalities and domains. Recent studies of feature-repetition effects have demonstrated interactions between shape, color, and location in the visual modality and between pitch, loudness, and location in the auditory modality: repeating one feature is beneficial if other features are also repeated, but detrimental if not. These partial-repetition costs suggest that co-occurring features are spontaneously bound into temporary event files. Here, we investigated whether these observations can be extended to features from different sensory modalities, combining visual and auditory features in Experiment 1 and auditory and tactile features in Experiment 2. The same types of interactions, as for unimodal feature combinations, were obtained including interactions between stimulus and response features. However, the size of the interactions varied with the particular combination of features, suggesting that the salience of features and the temporal overlap between feature-code activations plays a mediating role. Springer-Verlag 2008-10-03 2009-09 /pmc/articles/PMC2708333/ /pubmed/18836741 http://dx.doi.org/10.1007/s00426-008-0163-5 Text en © The Author(s) 2008
spellingShingle Original Article
Zmigrod, Sharon
Spapé, Michiel
Hommel, Bernhard
Intermodal event files: integrating features across vision, audition, taction, and action
title Intermodal event files: integrating features across vision, audition, taction, and action
title_full Intermodal event files: integrating features across vision, audition, taction, and action
title_fullStr Intermodal event files: integrating features across vision, audition, taction, and action
title_full_unstemmed Intermodal event files: integrating features across vision, audition, taction, and action
title_short Intermodal event files: integrating features across vision, audition, taction, and action
title_sort intermodal event files: integrating features across vision, audition, taction, and action
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2708333/
https://www.ncbi.nlm.nih.gov/pubmed/18836741
http://dx.doi.org/10.1007/s00426-008-0163-5
work_keys_str_mv AT zmigrodsharon intermodaleventfilesintegratingfeaturesacrossvisionauditiontactionandaction
AT spapemichiel intermodaleventfilesintegratingfeaturesacrossvisionauditiontactionandaction
AT hommelbernhard intermodaleventfilesintegratingfeaturesacrossvisionauditiontactionandaction