Multisensory perception as an associative learning process

Suppose that you are at a live jazz show. The drummer begins a solo. You see the cymbal jolt and you hear the clang. But in addition to seeing the cymbal jolt and hearing the clang, you are also aware that the jolt and the clang are part of the same event. Casey O’Callaghan (forthcoming) calls this awareness “intermodal feature binding awareness.” Psychologists have long assumed that multimodal perceptions such as this one are the result of an automatic feature binding mechanism (see Pourtois et al., 2000; Vatakis and Spence, 2007; Navarra et al., 2012). I present new evidence against this. I argue that there is no automatic feature binding mechanism that couples features like the jolt and the clang together. Instead, when you experience the jolt and the clang as part of the same event, this is the result of an associative learning process. The cymbal’s jolt and the clang are best understood as a single learned perceptual unit, rather than as automatically bound. I outline the specific learning process in perception called “unitization,” whereby we come to “chunk” the world into multimodal units. Unitization has never before been applied to multimodal cases. Yet I argue that this learning process can do the same work that intermodal binding would do, and that this issue has important philosophical implications. Specifically, whether we take multimodal cases to involve a binding mechanism or an associative process will have an impact on philosophical issues from Molyneux’s question to the question of how active or passive we consider perception to be.


Bibliographic Details
Main Author: Connolly, Kevin
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2014
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4176039/
https://www.ncbi.nlm.nih.gov/pubmed/25309498
http://dx.doi.org/10.3389/fpsyg.2014.01095
Collection: PubMed
ID: pubmed-4176039
Institution: National Center for Biotechnology Information
Record format: MEDLINE/PubMed
Journal: Front Psychol (Frontiers in Psychology)
Publication date: 2014-09-26
License: Copyright © 2014 Connolly (http://creativecommons.org/licenses/by/4.0/). This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.