
Evidence for the adaptive parsing of non-communicative eye movements during joint attention interactions

During social interactions, the ability to detect and respond to gaze-based joint attention bids often involves the evaluation of non-communicative eye movements. However, very little is known about how much humans are able to track and parse spatial information from these non-communicative eye movements over time, and the extent to which this influences joint attention outcomes. This was investigated in the current study using an interactive computer-based joint attention game. Using a fully within-subjects design, we specifically examined whether participants were quicker to respond to communicative joint attention bids that followed predictive, as opposed to random or no, non-communicative gaze behaviour. Our results suggest that in complex, dynamic tasks, people adaptively use and dismiss non-communicative gaze information depending on whether it informs the locus of an upcoming joint attention bid. We also went further to examine the extent to which this ability to track dynamic spatial information was specific to processing gaze information. This was achieved by comparing performance to a closely matched non-social task where eye gaze cues were replaced with dynamic arrow stimuli. Whilst we found that people are also able to track and use dynamic non-social information from arrows, there was clear evidence for a relative advantage for tracking gaze cues during social interactions. The implications of these findings for social neuroscience and autism research are discussed.

Bibliographic Details
Main Authors: Alhasan, Ayeh; Caruana, Nathan
Format: Online Article (Text)
Language: English
Published: PeerJ Inc., 2023-11-21
Subjects: Cognitive Disorders
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10668824/
https://www.ncbi.nlm.nih.gov/pubmed/38025743
http://dx.doi.org/10.7717/peerj.16363
Collection: PubMed (record id: pubmed-10668824)
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Rights: © 2023 Alhasan et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ) and either DOI or URL of the article must be cited.