Data-driven group comparisons of eye fixations to dynamic stimuli
Main Authors: | Onwuegbusi, Tochukwu; Hermens, Frouke; Hogue, Todd |
Format: | Online Article Text |
Language: | English |
Published: | SAGE Publications, 2021 |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9016662/ https://www.ncbi.nlm.nih.gov/pubmed/34507503 http://dx.doi.org/10.1177/17470218211048060 |
_version_ | 1784688574445125632 |
author | Onwuegbusi, Tochukwu; Hermens, Frouke; Hogue, Todd |
author_facet | Onwuegbusi, Tochukwu; Hermens, Frouke; Hogue, Todd |
author_sort | Onwuegbusi, Tochukwu |
collection | PubMed |
description | Recent advances in software and hardware have allowed eye tracking to move away from static images to more ecologically relevant video streams. The analysis of eye tracking data for such dynamic stimuli, however, is not without challenges. The frame-by-frame coding of regions of interest (ROIs) is labour-intensive and computer vision techniques to automatically code such ROIs are not yet mainstream, restricting the use of such stimuli. Combined with the more general problem of defining relevant ROIs for video frames, methods are needed that facilitate data analysis. Here, we present a first evaluation of an easy-to-implement data-driven method with the potential to address these issues. To test the new method, we examined the differences in eye movements of self-reported politically left- or right-wing leaning participants to video clips of left- and right-wing politicians. The results show that our method can accurately predict group membership on the basis of eye movement patterns, isolate video clips that best distinguish people on the political left–right spectrum, and reveal the section of each video clip with the largest group differences. Our methodology thereby aids the understanding of group differences in gaze behaviour, and the identification of critical stimuli for follow-up studies or for use in saccade diagnosis. |
format | Online Article Text |
id | pubmed-9016662 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | SAGE Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-9016662 2022-04-20 Data-driven group comparisons of eye fixations to dynamic stimuli Onwuegbusi, Tochukwu; Hermens, Frouke; Hogue, Todd Q J Exp Psychol (Hove) Original Articles |
SAGE Publications 2021-09-29 2022-06 /pmc/articles/PMC9016662/ /pubmed/34507503 http://dx.doi.org/10.1177/17470218211048060 Text en © Experimental Psychology Society 2021. This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits any use, reproduction, and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage). |
spellingShingle | Original Articles; Onwuegbusi, Tochukwu; Hermens, Frouke; Hogue, Todd; Data-driven group comparisons of eye fixations to dynamic stimuli |
title | Data-driven group comparisons of eye fixations to dynamic stimuli |
title_full | Data-driven group comparisons of eye fixations to dynamic stimuli |
title_fullStr | Data-driven group comparisons of eye fixations to dynamic stimuli |
title_full_unstemmed | Data-driven group comparisons of eye fixations to dynamic stimuli |
title_short | Data-driven group comparisons of eye fixations to dynamic stimuli |
title_sort | data-driven group comparisons of eye fixations to dynamic stimuli |
topic | Original Articles |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9016662/ https://www.ncbi.nlm.nih.gov/pubmed/34507503 http://dx.doi.org/10.1177/17470218211048060 |
work_keys_str_mv | AT onwuegbusitochukwu datadrivengroupcomparisonsofeyefixationstodynamicstimuli AT hermensfrouke datadrivengroupcomparisonsofeyefixationstodynamicstimuli AT hoguetodd datadrivengroupcomparisonsofeyefixationstodynamicstimuli |