Accelerating eye movement research via accurate and affordable smartphone eye tracking
Eye tracking has been widely used for decades in vision research, language and usability. However, most prior research has focused on large desktop displays using specialized eye trackers that are expensive and cannot scale. Little is known about eye movement behavior on phones, despite their pervasiveness and large amount of time spent. We leverage machine learning to demonstrate accurate smartphone-based eye tracking without any additional hardware. We show that the accuracy of our method is comparable to state-of-the-art mobile eye trackers that are 100x more expensive. Using data from over 100 opted-in users, we replicate key findings from previous eye movement research on oculomotor tasks and saliency analyses during natural image viewing. In addition, we demonstrate the utility of smartphone-based gaze for detecting reading comprehension difficulty. Our results show the potential for scaling eye movement research by orders-of-magnitude to thousands of participants (with explicit consent), enabling advances in vision research, accessibility and healthcare.
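The abstract describes machine-learning gaze estimation from a phone's front camera without extra hardware, but this record does not detail the model itself. The sketch below is a minimal illustration of that general approach, assuming a small convolutional regressor over left/right eye crops; the class name, layer sizes, and input resolution are hypothetical, not the paper's architecture.

```python
# Hypothetical illustration only: the record does not specify the paper's model.
# A small convolutional regressor mapping two eye crops to an (x, y) screen point.
import torch
import torch.nn as nn

class EyeGazeNet(nn.Module):  # name and sizes are assumptions, not from the paper
    def __init__(self):
        super().__init__()
        # Shared feature extractor applied to each eye crop independently.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (B, 32, 1, 1)
        )
        # Concatenated left/right features -> 2-D gaze-coordinate regression.
        self.head = nn.Sequential(
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 2),
        )

    def forward(self, left_eye, right_eye):
        l = self.features(left_eye).flatten(1)   # (B, 32)
        r = self.features(right_eye).flatten(1)  # (B, 32)
        return self.head(torch.cat([l, r], dim=1))  # (B, 2) predicted gaze point

# Toy usage with random 64x64 RGB eye crops; real inputs would come from
# face/eye detection on front-camera frames.
model = EyeGazeNet()
left, right = torch.randn(4, 3, 64, 64), torch.randn(4, 3, 64, 64)
print(model(left, right).shape)  # torch.Size([4, 2])
```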
Main Authors: | Valliappan, Nachiappan; Dai, Na; Steinberg, Ethan; He, Junfeng; Rogers, Kantwon; Ramachandran, Venky; Xu, Pingmei; Shojaeizadeh, Mina; Guo, Li; Kohlhoff, Kai; Navalpakkam, Vidhya |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2020 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7486382/ https://www.ncbi.nlm.nih.gov/pubmed/32917902 http://dx.doi.org/10.1038/s41467-020-18360-5 |
_version_ | 1783581324019236864 |
---|---|
author | Valliappan, Nachiappan; Dai, Na; Steinberg, Ethan; He, Junfeng; Rogers, Kantwon; Ramachandran, Venky; Xu, Pingmei; Shojaeizadeh, Mina; Guo, Li; Kohlhoff, Kai; Navalpakkam, Vidhya |
author_facet | Valliappan, Nachiappan; Dai, Na; Steinberg, Ethan; He, Junfeng; Rogers, Kantwon; Ramachandran, Venky; Xu, Pingmei; Shojaeizadeh, Mina; Guo, Li; Kohlhoff, Kai; Navalpakkam, Vidhya |
author_sort | Valliappan, Nachiappan |
collection | PubMed |
description | Eye tracking has been widely used for decades in vision research, language and usability. However, most prior research has focused on large desktop displays using specialized eye trackers that are expensive and cannot scale. Little is known about eye movement behavior on phones, despite their pervasiveness and large amount of time spent. We leverage machine learning to demonstrate accurate smartphone-based eye tracking without any additional hardware. We show that the accuracy of our method is comparable to state-of-the-art mobile eye trackers that are 100x more expensive. Using data from over 100 opted-in users, we replicate key findings from previous eye movement research on oculomotor tasks and saliency analyses during natural image viewing. In addition, we demonstrate the utility of smartphone-based gaze for detecting reading comprehension difficulty. Our results show the potential for scaling eye movement research by orders-of-magnitude to thousands of participants (with explicit consent), enabling advances in vision research, accessibility and healthcare. |
format | Online Article Text |
id | pubmed-7486382 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-7486382 2020-09-21 Accelerating eye movement research via accurate and affordable smartphone eye tracking. Valliappan, Nachiappan; Dai, Na; Steinberg, Ethan; He, Junfeng; Rogers, Kantwon; Ramachandran, Venky; Xu, Pingmei; Shojaeizadeh, Mina; Guo, Li; Kohlhoff, Kai; Navalpakkam, Vidhya. Nat Commun, Article. Nature Publishing Group UK, 2020-09-11. /pmc/articles/PMC7486382/ /pubmed/32917902 http://dx.doi.org/10.1038/s41467-020-18360-5 Text en © The Author(s) 2020. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License; to view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
spellingShingle | Article Valliappan, Nachiappan; Dai, Na; Steinberg, Ethan; He, Junfeng; Rogers, Kantwon; Ramachandran, Venky; Xu, Pingmei; Shojaeizadeh, Mina; Guo, Li; Kohlhoff, Kai; Navalpakkam, Vidhya Accelerating eye movement research via accurate and affordable smartphone eye tracking
title | Accelerating eye movement research via accurate and affordable smartphone eye tracking |
title_full | Accelerating eye movement research via accurate and affordable smartphone eye tracking |
title_fullStr | Accelerating eye movement research via accurate and affordable smartphone eye tracking |
title_full_unstemmed | Accelerating eye movement research via accurate and affordable smartphone eye tracking |
title_short | Accelerating eye movement research via accurate and affordable smartphone eye tracking |
title_sort | accelerating eye movement research via accurate and affordable smartphone eye tracking |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7486382/ https://www.ncbi.nlm.nih.gov/pubmed/32917902 http://dx.doi.org/10.1038/s41467-020-18360-5 |
work_keys_str_mv | AT valliappannachiappan acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT daina acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT steinbergethan acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT hejunfeng acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT rogerskantwon acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT ramachandranvenky acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT xupingmei acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT shojaeizadehmina acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT guoli acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT kohlhoffkai acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking AT navalpakkamvidhya acceleratingeyemovementresearchviaaccurateandaffordablesmartphoneeyetracking |