Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data
The discovery of mirror neurons has suggested a potential neural basis for simulation and common coding theories of action perception, theories which propose that we understand other people's actions because perceiving their actions activates some of our neurons in much the same way as when we perform the actions.
Main Authors: | Etzel, Joset A.; Gazzola, Valeria; Keysers, Christian |
---|---|
Format: | Text |
Language: | English |
Published: | Public Library of Science, 2008 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2577733/ https://www.ncbi.nlm.nih.gov/pubmed/18997869 http://dx.doi.org/10.1371/journal.pone.0003690 |
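The abstract describes cross-modal multivariate classification: a classifier trained to distinguish heard hand versus mouth actions from premotor activity patterns is then tested, without retraining, on patterns recorded while the same participant executes hand or mouth actions. The sketch below illustrates that logic with a linear SVM from scikit-learn on synthetic data; the array shapes, the simulated shared pattern, and the permutation test are illustrative assumptions, not the authors' analysis pipeline.

```python
# Minimal sketch of cross-modal classification (illustrative, synthetic data only).
# Train on sensory-domain trials (listening to hand vs. mouth actions), then test
# the same classifier, without retraining, on motor-domain trials (executing actions).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200                       # hypothetical trials per modality, ROI size

# Simulate a weak hand-vs-mouth pattern shared across modalities (the simulation-theory case).
shared_pattern = rng.normal(size=n_voxels)
labels = np.repeat([0, 1], n_trials // 2)          # 0 = hand action, 1 = mouth action

def simulate_modality(noise_sd=1.0):
    signal = np.outer(np.where(labels == 0, 0.5, -0.5), shared_pattern)
    return signal + rng.normal(scale=noise_sd, size=(n_trials, n_voxels))

X_listen = simulate_modality()                     # sensory (auditory) voxel patterns
X_execute = simulate_modality()                    # motor (execution) voxel patterns

# Train in the sensory domain, test in the motor domain: no additional training.
clf = SVC(kernel="linear").fit(X_listen, labels)
cross_modal_acc = clf.score(X_execute, labels)

# Permutation test: compare accuracy against a label-shuffled null distribution.
null = [SVC(kernel="linear").fit(X_listen, rng.permutation(labels)).score(X_execute, labels)
        for _ in range(200)]
p_value = (np.sum(np.array(null) >= cross_modal_acc) + 1) / (len(null) + 1)

print(f"cross-modal accuracy: {cross_modal_acc:.2f}, permutation p = {p_value:.3f}")
```

Above-chance cross-modal accuracy in this toy setting plays the role the paper assigns to it: evidence that the sensory and motor activity patterns share a common code.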
_version_ | 1782160508902703104 |
---|---|
author | Etzel, Joset A.; Gazzola, Valeria; Keysers, Christian |
author_facet | Etzel, Joset A.; Gazzola, Valeria; Keysers, Christian |
author_sort | Etzel, Joset A. |
collection | PubMed |
description | The discovery of mirror neurons has suggested a potential neural basis for simulation and common coding theories of action perception, theories which propose that we understand other people's actions because perceiving their actions activates some of our neurons in much the same way as when we perform the actions. We propose testing this model directly in humans with functional magnetic resonance imaging (fMRI) by means of cross-modal classification. Cross-modal classification evaluates whether a classifier that has learned to separate stimuli in the sensory domain can also separate the stimuli in the motor domain. Successful classification provides support for simulation theories because it means that the fMRI signal, and presumably brain activity, is similar when perceiving and performing actions. In this paper we demonstrate the feasibility of the technique by showing that classifiers which have learned to discriminate whether a participant heard a hand or a mouth action, based on the activity patterns in the premotor cortex, can also determine, without additional training, whether the participant executed a hand or mouth action. This provides direct evidence that, while perceiving others' actions, (1) the pattern of activity in premotor voxels with sensory properties is a significant source of information regarding the nature of these actions, and (2) that this information shares a common code with motor execution. |
format | Text |
id | pubmed-2577733 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2008 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-2577733 2008-11-10 Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data Etzel, Joset A. Gazzola, Valeria Keysers, Christian PLoS One Research Article The discovery of mirror neurons has suggested a potential neural basis for simulation and common coding theories of action perception, theories which propose that we understand other people's actions because perceiving their actions activates some of our neurons in much the same way as when we perform the actions. We propose testing this model directly in humans with functional magnetic resonance imaging (fMRI) by means of cross-modal classification. Cross-modal classification evaluates whether a classifier that has learned to separate stimuli in the sensory domain can also separate the stimuli in the motor domain. Successful classification provides support for simulation theories because it means that the fMRI signal, and presumably brain activity, is similar when perceiving and performing actions. In this paper we demonstrate the feasibility of the technique by showing that classifiers which have learned to discriminate whether a participant heard a hand or a mouth action, based on the activity patterns in the premotor cortex, can also determine, without additional training, whether the participant executed a hand or mouth action. This provides direct evidence that, while perceiving others' actions, (1) the pattern of activity in premotor voxels with sensory properties is a significant source of information regarding the nature of these actions, and (2) that this information shares a common code with motor execution. Public Library of Science 2008-11-10 /pmc/articles/PMC2577733/ /pubmed/18997869 http://dx.doi.org/10.1371/journal.pone.0003690 Text en Etzel et al. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited. |
spellingShingle | Research Article Etzel, Joset A. Gazzola, Valeria Keysers, Christian Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data |
title | Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data |
title_full | Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data |
title_fullStr | Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data |
title_full_unstemmed | Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data |
title_short | Testing Simulation Theory with Cross-Modal Multivariate Classification of fMRI Data |
title_sort | testing simulation theory with cross-modal multivariate classification of fmri data |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2577733/ https://www.ncbi.nlm.nih.gov/pubmed/18997869 http://dx.doi.org/10.1371/journal.pone.0003690 |
work_keys_str_mv | AT etzeljoseta testingsimulationtheorywithcrossmodalmultivariateclassificationoffmridata AT gazzolavaleria testingsimulationtheorywithcrossmodalmultivariateclassificationoffmridata AT keyserschristian testingsimulationtheorywithcrossmodalmultivariateclassificationoffmridata |