Multiscale Enaction Model (MEM): the case of complexity and “context-sensitivity” in vision

I review the data on human visual perception that reveal the critical role played by non-visual contextual factors influencing visual activity. The global perspective that progressively emerges reveals that vision is sensitive to multiple couplings with other systems whose nature and levels of abstraction in science are highly variable. Contrary to some views where vision is immersed in modular hard-wired modules, rather independent from higher-level or other non-cognitive processes, converging data gathered in this article suggest that visual perception can be theorized in the larger context of biological, physical, and social systems with which it is coupled, and through which it is enacted. Therefore, any attempt to model complexity and multiscale couplings, or to develop a complex synthesis in the fields of mind, brain, and behavior, shall involve a systematic empirical study of both connectedness between systems or subsystems, and the embodied, multiscale and flexible teleology of subsystems. The conceptual model (Multiscale Enaction Model [MEM]) that is introduced in this paper finally relates empirical evidence gathered from psychology to biocomputational data concerning the human brain. Both psychological and biocomputational descriptions of MEM are proposed in order to help fill in the gap between scales of scientific analysis and to provide an account for both the autopoiesis-driven search for information, and emerging perception.

Bibliographic Details
Main author: Laurent, Éric
Format: Online article (text)
Language: English
Published: Frontiers Media S.A., 2014-12-19
Subjects: Psychology
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4271595/
https://www.ncbi.nlm.nih.gov/pubmed/25566115
http://dx.doi.org/10.3389/fpsyg.2014.01425
Collection: PubMed (record pubmed-4271595, MEDLINE/PubMed; National Center for Biotechnology Information)

Copyright © 2014 Laurent. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, http://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.