iCatcher: A neural network approach for automated coding of young children's eye movements

Infants' looking behaviors are often used for measuring attention, real‐time processing, and learning—often using low‐resolution videos. Despite the ubiquity of gaze‐related methods in developmental science, current analysis techniques usually involve laborious post hoc coding, imprecise real‐time coding, or expensive eye trackers that may increase data loss and require a calibration phase. As an alternative, we propose using computer vision methods to perform automatic gaze estimation from low‐resolution videos. At the core of our approach is a neural network that classifies gaze directions in real time. We compared our method, called iCatcher, to manually annotated videos from a prior study in which infants looked at one of two pictures on a screen. We demonstrated that the accuracy of iCatcher approximates that of human annotators and that it replicates the prior study's results. Our method is publicly available as an open‐source repository at https://github.com/yoterel/iCatcher.
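
The record above describes the core of iCatcher as a neural network that classifies gaze directions in real time from low-resolution video. The short PyTorch sketch below illustrates that general idea only: the GazeDirectionNet class, the 64x64 input size, and the left/right/away label set are assumptions made for this illustration and are not taken from the iCatcher repository or paper.

# Illustrative sketch only: a tiny frame-level gaze-direction classifier.
# This is NOT the published iCatcher architecture; the class name, input
# size, and label set ("left", "right", "away") are assumptions.
import torch
import torch.nn as nn

class GazeDirectionNet(nn.Module):
    """Classifies a low-resolution face crop into a discrete gaze direction."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of face crops, shape (N, 3, H, W), e.g. 64x64 pixels
        feats = self.features(x).flatten(1)
        return self.classifier(feats)

if __name__ == "__main__":
    labels = ["left", "right", "away"]          # assumed label set
    model = GazeDirectionNet(num_classes=len(labels)).eval()
    frame = torch.rand(1, 3, 64, 64)            # stand-in for one video frame crop
    with torch.no_grad():
        pred = model(frame).argmax(dim=1).item()
    print(f"Predicted gaze direction: {labels[pred]}")

In practice such a classifier would be run on every frame of a session video and the per-frame labels aggregated into looking-time measures; the specific pipeline iCatcher uses is documented in the repository linked above.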

Bibliographic Details
Main Authors: Erel, Yotam; Potter, Christine E.; Jaffe‐Dax, Sagi; Lew‐Williams, Casey; Bermano, Amit H.
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc., 2022
Subjects: Research Articles
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9320879/
https://www.ncbi.nlm.nih.gov/pubmed/35416378
http://dx.doi.org/10.1111/infa.12468
collection PubMed
id pubmed-9320879
institution National Center for Biotechnology Information
language English
publishDate 2022 (published online 2022-04-13)
publisher John Wiley and Sons Inc.
record_format MEDLINE/PubMed
journal Infancy (Research Articles)
rights © 2022 The Authors. Infancy published by Wiley Periodicals LLC on behalf of the International Congress of Infant Studies. This is an open access article under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited.