I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system
Main Authors: | Murthy, L R D, Brahmbhatt, Siddhi, Arjun, Somnath, Biswas, Pradipta |
Format: | Online Article Text |
Language: | English |
Published: | Bern Open Publishing, 2021 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8561667/ https://www.ncbi.nlm.nih.gov/pubmed/34733445 http://dx.doi.org/10.16910/jemr.14.4.2 |
_version_ | 1784593133246349312 |
author | Murthy, L R D Brahmbhatt, Siddhi Arjun, Somnath Biswas, Pradipta |
author_facet | Murthy, L R D Brahmbhatt, Siddhi Arjun, Somnath Biswas, Pradipta |
author_sort | Murthy, L R D |
collection | PubMed |
description | The gaze estimation problem can be addressed using either model-based or appearance-based approaches. Model-based approaches rely on features extracted from eye images to fit a 3D eyeball model and obtain a gaze point estimate, while appearance-based methods attempt to map captured eye images directly to a gaze point without any handcrafted features. Recently, the availability of large datasets and novel deep learning techniques has allowed appearance-based methods to achieve higher accuracy than model-based approaches. However, many appearance-based gaze estimation systems perform well in within-dataset validation but fail to provide the same degree of accuracy in cross-dataset evaluation. Hence, it is still unclear how well the current state-of-the-art approaches perform in real time in an interactive setting on unseen users. This paper proposes I2DNet, a novel architecture aimed at improving subject-independent gaze estimation accuracy, which achieved state-of-the-art mean angle errors of 4.3 and 8.4 degrees on the MPIIGaze and RT-Gene datasets respectively. We evaluated the proposed system as a real-time gaze-controlled interface for a 9-block pointing and selection task and compared it with Webgazer.js and OpenFace 2.0. We conducted a user study with 16 participants; our proposed system reduces selection time and the number of missed selections statistically significantly compared to the other two systems. |
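The mean angle error cited in the abstract is the standard metric for appearance-based gaze estimation benchmarks such as MPIIGaze and RT-Gene: the angle between the predicted and ground-truth 3D gaze direction vectors, averaged over all samples. The sketch below is a minimal illustration of how this metric is typically computed; it is not code from the paper, and the function and variable names are illustrative only.

```python
import numpy as np

def mean_angular_error(pred, gt):
    """Mean angle (in degrees) between predicted and ground-truth 3D gaze vectors.

    pred, gt: arrays of shape (N, 3), one gaze direction vector per sample.
    """
    # Normalize both sets of vectors to unit length.
    pred = pred / np.linalg.norm(pred, axis=1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=1, keepdims=True)
    # Cosine of the angle between corresponding vectors; clip for numerical safety.
    cos_sim = np.clip(np.sum(pred * gt, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos_sim)).mean()

# Example with two predicted vs. ground-truth gaze directions (hypothetical values).
pred = np.array([[0.0, 0.0, -1.0], [0.1, 0.0, -1.0]])
gt = np.array([[0.0, 0.1, -1.0], [0.0, 0.0, -1.0]])
print(mean_angular_error(pred, gt))  # prints a small error in degrees
```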
format | Online Article Text |
id | pubmed-8561667 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Bern Open Publishing |
record_format | MEDLINE/PubMed |
spelling | pubmed-8561667 2021-11-02 I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system Murthy, L R D Brahmbhatt, Siddhi Arjun, Somnath Biswas, Pradipta J Eye Mov Res Research Article The gaze estimation problem can be addressed using either model-based or appearance-based approaches. Model-based approaches rely on features extracted from eye images to fit a 3D eyeball model and obtain a gaze point estimate, while appearance-based methods attempt to map captured eye images directly to a gaze point without any handcrafted features. Recently, the availability of large datasets and novel deep learning techniques has allowed appearance-based methods to achieve higher accuracy than model-based approaches. However, many appearance-based gaze estimation systems perform well in within-dataset validation but fail to provide the same degree of accuracy in cross-dataset evaluation. Hence, it is still unclear how well the current state-of-the-art approaches perform in real time in an interactive setting on unseen users. This paper proposes I2DNet, a novel architecture aimed at improving subject-independent gaze estimation accuracy, which achieved state-of-the-art mean angle errors of 4.3 and 8.4 degrees on the MPIIGaze and RT-Gene datasets respectively. We evaluated the proposed system as a real-time gaze-controlled interface for a 9-block pointing and selection task and compared it with Webgazer.js and OpenFace 2.0. We conducted a user study with 16 participants; our proposed system reduces selection time and the number of missed selections statistically significantly compared to the other two systems. Bern Open Publishing 2021-08-31 /pmc/articles/PMC8561667/ /pubmed/34733445 http://dx.doi.org/10.16910/jemr.14.4.2 Text en https://creativecommons.org/licenses/by/4.0/ This work is licensed under a Creative Commons Attribution 4.0 International License ( https://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use and redistribution provided that the original author and source are credited. |
spellingShingle | Research Article Murthy, L R D Brahmbhatt, Siddhi Arjun, Somnath Biswas, Pradipta I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system |
title | I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system |
title_full | I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system |
title_fullStr | I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system |
title_full_unstemmed | I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system |
title_short | I2DNet - Design and Real-Time Evaluation of Appearance-based gaze estimation system |
title_sort | i2dnet - design and real-time evaluation of appearance-based gaze estimation system |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8561667/ https://www.ncbi.nlm.nih.gov/pubmed/34733445 http://dx.doi.org/10.16910/jemr.14.4.2 |
work_keys_str_mv | AT murthylrd i2dnetdesignandrealtimeevaluationofappearancebasedgazeestimationsystem AT brahmbhattsiddhi i2dnetdesignandrealtimeevaluationofappearancebasedgazeestimationsystem AT arjunsomnath i2dnetdesignandrealtimeevaluationofappearancebasedgazeestimationsystem AT biswaspradipta i2dnetdesignandrealtimeevaluationofappearancebasedgazeestimationsystem |