Interactionally Embedded Gestalt Principles of Multimodal Human Communication
Main Authors: | Trujillo, James P.; Holler, Judith |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | SAGE Publications, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10475215/ https://www.ncbi.nlm.nih.gov/pubmed/36634318 http://dx.doi.org/10.1177/17456916221141422 |
author | Trujillo, James P.; Holler, Judith
collection | PubMed |
description | Natural human interaction requires us to produce and process many different signals, including speech, hand and head gestures, and facial expressions. These communicative signals, which occur in a variety of temporal relations with each other (e.g., parallel or temporally misaligned), must be rapidly processed as a coherent message by the receiver. In this contribution, we introduce the notion of interactionally embedded, affordance-driven gestalt perception as a framework that can explain how this rapid processing of multimodal signals is achieved as efficiently as it is. We discuss empirical evidence showing how basic principles of gestalt perception can explain some aspects of unimodal phenomena such as verbal language processing and visual scene perception but require additional features to explain multimodal human communication. We propose a framework in which high-level gestalt predictions are continuously updated by incoming sensory input, such as unfolding speech and visual signals. We outline the constituent processes that shape high-level gestalt perception and their role in perceiving relevance and prägnanz. Finally, we provide testable predictions that arise from this multimodal interactionally embedded gestalt-perception framework. This review and framework therefore provide a theoretically motivated account of how we may understand the highly complex, multimodal behaviors inherent in natural social interaction. |
format | Online Article Text |
id | pubmed-10475215 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | SAGE Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-10475215 2023-09-04 Perspect Psychol Sci Article SAGE Publications 2023-01-12 2023-09 /pmc/articles/PMC10475215/ /pubmed/36634318 http://dx.doi.org/10.1177/17456916221141422 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage). |
title | Interactionally Embedded Gestalt Principles of Multimodal Human Communication |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10475215/ https://www.ncbi.nlm.nih.gov/pubmed/36634318 http://dx.doi.org/10.1177/17456916221141422 |