Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research
Main Authors: Strobl, Maximilian A. R.; Lipsmeier, Florian; Demenescu, Liliana R.; Gossens, Christian; Lindemann, Michael; De Vos, Maarten
Format: Online Article Text
Language: English
Published: BioMed Central, 2019
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6499948/ https://www.ncbi.nlm.nih.gov/pubmed/31053071 http://dx.doi.org/10.1186/s12938-019-0670-1
_version_ | 1783415856353509376 |
author | Strobl, Maximilian A. R.; Lipsmeier, Florian; Demenescu, Liliana R.; Gossens, Christian; Lindemann, Michael; De Vos, Maarten
author_facet | Strobl, Maximilian A. R.; Lipsmeier, Florian; Demenescu, Liliana R.; Gossens, Christian; Lindemann, Michael; De Vos, Maarten
author_sort | Strobl, Maximilian A. R. |
collection | PubMed |
description | BACKGROUND: Avoidance of looking others in the eye is a characteristic symptom of Autism Spectrum Disorders (ASD), and it has been hypothesised that quantitative monitoring of gaze patterns could be useful to objectively evaluate treatments. However, tools to measure gaze behaviour on a regular basis at a manageable cost are missing. In this paper, we investigated whether a smartphone-based tool could address this problem. Specifically, we assessed the accuracy with which the phone-based, state-of-the-art eye-tracking algorithm iTracker can distinguish between gaze towards the eyes and the mouth of a face displayed on the smartphone screen. This might allow mobile, longitudinal monitoring of gaze aversion behaviour in ASD patients in the future. RESULTS: We simulated a smartphone application in which subjects were shown an image on the screen and their gaze was analysed using iTracker. We evaluated the accuracy of our set-up across three tasks in a cohort of 17 healthy volunteers. In the first two tasks, subjects were shown different-sized images of a face and asked to alternate their gaze focus between the eyes and the mouth. In the last task, participants were asked to trace out a circle on the screen with their eyes. We confirm that iTracker can recapitulate the true gaze patterns and capture the relative position of gaze correctly, even on a different phone system from the one it was trained on. Subject-specific bias can be corrected using an error model informed by the calibration data. We compare two calibration methods and observe that a linear model performs better than a previously proposed support vector regression-based method. CONCLUSIONS: Under controlled conditions, it is possible to reliably distinguish between gaze towards the eyes and the mouth with a smartphone-based set-up. However, future research will be required to improve the robustness of the system to the roll angle of the phone and the distance between the user and the screen to allow deployment in a home setting. We conclude that a smartphone-based gaze-monitoring tool provides promising opportunities for more quantitative monitoring of ASD. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (10.1186/s12938-019-0670-1) contains supplementary material, which is available to authorized users.
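The description above notes that subject-specific bias in the raw gaze estimates can be corrected with a calibration model, that a linear model outperformed a previously proposed support vector regression (SVR) approach, and that the corrected gaze is used to tell looks at the eyes from looks at the mouth. The following is a minimal Python sketch of how such a per-subject calibration and region check could be set up; the simulated data, bias values, model hyper-parameters and eye/mouth bounding boxes are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch (not the authors' code): correct raw gaze estimates,
# e.g. from iTracker, with (a) a per-axis linear model and (b) an RBF-kernel
# SVR, then assign the corrected point to a facial region. All numbers are
# made up for demonstration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

def simulate(n):
    """Simulated data: true on-screen targets (cm, relative to the front
    camera) and raw gaze predictions with a subject-specific bias."""
    true = rng.uniform(low=[-3.0, -10.0], high=[3.0, -1.0], size=(n, 2))
    raw = true * 0.9 + rng.normal(0.0, 0.4, size=(n, 2)) + np.array([0.5, -0.8])
    return true, raw

def fit_per_axis(model_factory, raw, true):
    """Fit one regression model per screen axis (x and y)."""
    return [model_factory().fit(raw, true[:, axis]) for axis in range(2)]

def correct(models, raw):
    """Apply the per-axis models to raw predictions."""
    return np.column_stack([m.predict(raw) for m in models])

calib_true, calib_raw = simulate(20)   # points shown during calibration
test_true, test_raw = simulate(50)     # points from the actual task

linear_models = fit_per_axis(LinearRegression, calib_raw, calib_true)
svr_models = fit_per_axis(lambda: SVR(kernel="rbf", C=10.0, epsilon=0.1),
                          calib_raw, calib_true)

for name, models in [("linear", linear_models), ("svr", svr_models)]:
    err = mean_absolute_error(test_true, correct(models, test_raw))
    print(f"{name} calibration, mean absolute error: {err:.2f} cm")

# Hypothetical bounding boxes (cm) for the eye and mouth regions of the
# displayed face, used to label where a corrected gaze point falls.
REGIONS = {"eyes": ((-2.0, -4.0), (2.0, -2.5)),
           "mouth": ((-1.5, -7.5), (1.5, -6.0))}

def region_of(point):
    for label, ((x0, y0), (x1, y1)) in REGIONS.items():
        if x0 <= point[0] <= x1 and y0 <= point[1] <= y1:
            return label
    return "other"

print(region_of(correct(linear_models, test_raw)[0]))
```

Fitting the horizontal and vertical axes separately is a common way to implement a simple per-subject calibration; whether the linear fit or the SVR generalises better from a handful of calibration points is exactly the comparison the abstract reports, with the linear model coming out ahead in the paper's controlled setting.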
format | Online Article Text |
id | pubmed-6499948 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-6499948 2019-05-09 Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research Strobl, Maximilian A. R. Lipsmeier, Florian Demenescu, Liliana R. Gossens, Christian Lindemann, Michael De Vos, Maarten Biomed Eng Online Research BACKGROUND: Avoidance of looking others in the eye is a characteristic symptom of Autism Spectrum Disorders (ASD), and it has been hypothesised that quantitative monitoring of gaze patterns could be useful to objectively evaluate treatments. However, tools to measure gaze behaviour on a regular basis at a manageable cost are missing. In this paper, we investigated whether a smartphone-based tool could address this problem. Specifically, we assessed the accuracy with which the phone-based, state-of-the-art eye-tracking algorithm iTracker can distinguish between gaze towards the eyes and the mouth of a face displayed on the smartphone screen. This might allow mobile, longitudinal monitoring of gaze aversion behaviour in ASD patients in the future. RESULTS: We simulated a smartphone application in which subjects were shown an image on the screen and their gaze was analysed using iTracker. We evaluated the accuracy of our set-up across three tasks in a cohort of 17 healthy volunteers. In the first two tasks, subjects were shown different-sized images of a face and asked to alternate their gaze focus between the eyes and the mouth. In the last task, participants were asked to trace out a circle on the screen with their eyes. We confirm that iTracker can recapitulate the true gaze patterns and capture the relative position of gaze correctly, even on a different phone system from the one it was trained on. Subject-specific bias can be corrected using an error model informed by the calibration data. We compare two calibration methods and observe that a linear model performs better than a previously proposed support vector regression-based method. CONCLUSIONS: Under controlled conditions, it is possible to reliably distinguish between gaze towards the eyes and the mouth with a smartphone-based set-up. However, future research will be required to improve the robustness of the system to the roll angle of the phone and the distance between the user and the screen to allow deployment in a home setting. We conclude that a smartphone-based gaze-monitoring tool provides promising opportunities for more quantitative monitoring of ASD. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (10.1186/s12938-019-0670-1) contains supplementary material, which is available to authorized users. BioMed Central 2019-05-03 /pmc/articles/PMC6499948/ /pubmed/31053071 http://dx.doi.org/10.1186/s12938-019-0670-1 Text en © The Author(s) 2019 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. |
spellingShingle | Research Strobl, Maximilian A. R. Lipsmeier, Florian Demenescu, Liliana R. Gossens, Christian Lindemann, Michael De Vos, Maarten Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research |
title | Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research |
title_full | Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research |
title_fullStr | Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research |
title_full_unstemmed | Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research |
title_short | Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research |
title_sort | look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6499948/ https://www.ncbi.nlm.nih.gov/pubmed/31053071 http://dx.doi.org/10.1186/s12938-019-0670-1 |
work_keys_str_mv | AT stroblmaximilianar lookmeintheeyeevaluatingtheaccuracyofsmartphonebasedeyetrackingforpotentialapplicationinautismspectrumdisorderresearch AT lipsmeierflorian lookmeintheeyeevaluatingtheaccuracyofsmartphonebasedeyetrackingforpotentialapplicationinautismspectrumdisorderresearch AT demenesculilianar lookmeintheeyeevaluatingtheaccuracyofsmartphonebasedeyetrackingforpotentialapplicationinautismspectrumdisorderresearch AT gossenschristian lookmeintheeyeevaluatingtheaccuracyofsmartphonebasedeyetrackingforpotentialapplicationinautismspectrumdisorderresearch AT lindemannmichael lookmeintheeyeevaluatingtheaccuracyofsmartphonebasedeyetrackingforpotentialapplicationinautismspectrumdisorderresearch AT devosmaarten lookmeintheeyeevaluatingtheaccuracyofsmartphonebasedeyetrackingforpotentialapplicationinautismspectrumdisorderresearch |