
Remarks on Multimodality: Grammatical Interactions in the Parallel Architecture

Language is typically embedded in multimodal communication, yet models of linguistic competence do not often incorporate this complexity. Meanwhile, speech, gesture, and/or pictures are each considered as indivisible components of multimodal messages. Here, we argue that multimodality should not be characterized by whole interacting behaviors, but by interactions of similar substructures which permeate across expressive behaviors. These structures comprise a unified architecture and align within Jackendoff's Parallel Architecture: a modality, meaning, and grammar. Because this tripartite architecture persists across modalities, interactions can manifest within each of these substructures. Interactions between modalities alone create correspondences in time (ex. speech with gesture) or space (ex. writing with pictures) of the sensory signals, while multimodal meaning-making balances how modalities carry “semantic weight” for the gist of the whole expression. Here we focus primarily on interactions between grammars, which contrast across two variables: symmetry, related to the complexity of the grammars, and allocation, related to the relative independence of interacting grammars. While independent allocations keep grammars separate, substitutive allocation inserts expressions from one grammar into those of another. We show that substitution operates in interactions between all three natural modalities (vocal, bodily, graphic), and also in unimodal contexts within and between languages, as in codeswitching. Altogether, we argue that unimodal and multimodal expressions arise as emergent interactive states from a unified cognitive architecture, heralding a reconsideration of the “language faculty” itself.


Bibliographic Details
Main Authors: Cohn, Neil; Schilperoord, Joost
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Subjects: Artificial Intelligence
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8764459/
https://www.ncbi.nlm.nih.gov/pubmed/35059636
http://dx.doi.org/10.3389/frai.2021.778060
collection PubMed
id pubmed-8764459
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Front Artif Intell (Frontiers in Artificial Intelligence), Artificial Intelligence section
published 2022-01-04 by Frontiers Media S.A.
license Copyright © 2022 Cohn and Schilperoord. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.