Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech
Gestures and speech are clearly synchronized in many ways. However, previous studies have shown that the semantic similarity between gestures and speech breaks down as people approach transitions in understanding. Explanations for these gesture–speech mismatches, which focus on gestures and speech expressing different cognitive strategies, have been criticized for disregarding gestures' and speech's integration and synchronization. In the current study, we applied three different perspectives to investigate gesture–speech synchronization in an easy and a difficult task: temporal alignment, semantic similarity, and complexity matching.
Main Authors: | De Jonge‐Hoekstra, Lisette; Cox, Ralf F.A.; Van der Steen, Steffie; Dixon, James A. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | John Wiley and Sons Inc., 2021 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8365723/ https://www.ncbi.nlm.nih.gov/pubmed/34170013 http://dx.doi.org/10.1111/cogs.12989 |
_version_ | 1783738766604632064 |
---|---|
author | De Jonge‐Hoekstra, Lisette Cox, Ralf F.A. Van der Steen, Steffie Dixon, James A. |
author_facet | De Jonge‐Hoekstra, Lisette Cox, Ralf F.A. Van der Steen, Steffie Dixon, James A. |
author_sort | De Jonge‐Hoekstra, Lisette |
collection | PubMed |
description | Gestures and speech are clearly synchronized in many ways. However, previous studies have shown that the semantic similarity between gestures and speech breaks down as people approach transitions in understanding. Explanations for these gesture–speech mismatches, which focus on gestures and speech expressing different cognitive strategies, have been criticized for disregarding gestures' and speech's integration and synchronization. In the current study, we applied three different perspectives to investigate gesture–speech synchronization in an easy and a difficult task: temporal alignment, semantic similarity, and complexity matching. Participants engaged in a simple cognitive task and were assigned to either an easy or a difficult condition. We automatically measured pointing gestures, and we coded participants' speech, to determine the temporal alignment and semantic similarity between gestures and speech. Multifractal detrended fluctuation analysis was used to determine the extent of complexity matching between gestures and speech. We found that task difficulty indeed influenced gesture–speech synchronization in all three domains. We thereby extended the phenomenon of gesture–speech mismatches to difficult tasks in general. Furthermore, we investigated how temporal alignment, semantic similarity, and complexity matching were related in each condition, and how they predicted participants' task performance. Our study illustrates how combining multiple perspectives, originating from different research areas (i.e., coordination dynamics, complexity science, cognitive psychology), provides novel understanding about cognitive concepts in general and about gesture–speech synchronization and task difficulty in particular. |
format | Online Article Text |
id | pubmed-8365723 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | John Wiley and Sons Inc. |
record_format | MEDLINE/PubMed |
spelling | pubmed-83657232021-08-23 Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech De Jonge‐Hoekstra, Lisette Cox, Ralf F.A. Van der Steen, Steffie Dixon, James A. Cogn Sci Regular Articles Gestures and speech are clearly synchronized in many ways. However, previous studies have shown that the semantic similarity between gestures and speech breaks down as people approach transitions in understanding. Explanations for these gesture–speech mismatches, which focus on gestures and speech expressing different cognitive strategies, have been criticized for disregarding gestures' and speech's integration and synchronization. In the current study, we applied three different perspectives to investigate gesture–speech synchronization in an easy and a difficult task: temporal alignment, semantic similarity, and complexity matching. Participants engaged in a simple cognitive task and were assigned to either an easy or a difficult condition. We automatically measured pointing gestures, and we coded participants' speech, to determine the temporal alignment and semantic similarity between gestures and speech. Multifractal detrended fluctuation analysis was used to determine the extent of complexity matching between gestures and speech. We found that task difficulty indeed influenced gesture–speech synchronization in all three domains. We thereby extended the phenomenon of gesture–speech mismatches to difficult tasks in general. Furthermore, we investigated how temporal alignment, semantic similarity, and complexity matching were related in each condition, and how they predicted participants' task performance.
Our study illustrates how combining multiple perspectives, originating from different research areas (i.e., coordination dynamics, complexity science, cognitive psychology), provides novel understanding about cognitive concepts in general and about gesture–speech synchronization and task difficulty in particular. John Wiley and Sons Inc. 2021-06-25 2021-06 /pmc/articles/PMC8365723/ /pubmed/34170013 http://dx.doi.org/10.1111/cogs.12989 Text © 2021 The Authors. Cognitive Science published by Wiley Periodicals LLC on behalf of Cognitive Science Society (CSS). This is an open access article under the terms of the Creative Commons Attribution‐NonCommercial‐NoDerivs License (https://creativecommons.org/licenses/by-nc-nd/4.0/), which permits use and distribution in any medium, provided the original work is properly cited, the use is non‐commercial, and no modifications or adaptations are made. |
spellingShingle | Regular Articles De Jonge‐Hoekstra, Lisette Cox, Ralf F.A. Van der Steen, Steffie Dixon, James A. Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech |
title | Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech |
title_full | Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech |
title_fullStr | Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech |
title_full_unstemmed | Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech |
title_short | Easier Said Than Done? Task Difficulty's Influence on Temporal Alignment, Semantic Similarity, and Complexity Matching Between Gestures and Speech |
title_sort | easier said than done? task difficulty's influence on temporal alignment, semantic similarity, and complexity matching between gestures and speech |
topic | Regular Articles |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8365723/ https://www.ncbi.nlm.nih.gov/pubmed/34170013 http://dx.doi.org/10.1111/cogs.12989 |
work_keys_str_mv | AT dejongehoekstralisette easiersaidthandonetaskdifficultysinfluenceontemporalalignmentsemanticsimilarityandcomplexitymatchingbetweengesturesandspeech AT coxralffa easiersaidthandonetaskdifficultysinfluenceontemporalalignmentsemanticsimilarityandcomplexitymatchingbetweengesturesandspeech AT vandersteensteffie easiersaidthandonetaskdifficultysinfluenceontemporalalignmentsemanticsimilarityandcomplexitymatchingbetweengesturesandspeech AT dixonjamesa easiersaidthandonetaskdifficultysinfluenceontemporalalignmentsemanticsimilarityandcomplexitymatchingbetweengesturesandspeech |