
Integrating Gaze Tracking and Head-Motion Prediction for Mobile Device Authentication: A Proof of Concept

We introduce a two-stream model that uses reflexive eye movements for smart mobile device authentication. Our model is based on two pre-trained neural networks, iTracker and PredNet, targeting two independent tasks: (i) gaze tracking and (ii) future frame prediction. We design a procedure that randomly generates a visual stimulus on the screen of the mobile device while the front-facing camera simultaneously captures the user's head motions as they watch it. iTracker then calculates the gaze-coordinate error, which is treated as a static feature. To compensate for the imprecise gaze coordinates caused by the low resolution of the front-facing camera, we further use PredNet to extract dynamic features between consecutive frames. To resist traditional attacks (shoulder surfing and impersonation) during mobile device authentication, we combine the static and dynamic features to train a two-class support vector machine (SVM) classifier. Experimental results show that the classifier authenticates the identity of mobile device users with 98.6% accuracy.
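The record itself contains no code, but as a rough illustration of the final classification step described in the abstract, the sketch below fuses static (gaze-error) and dynamic (frame-motion) feature vectors and trains a two-class SVM. All data shapes, variable names, and SVM hyper-parameters are illustrative assumptions using the scikit-learn API, not details taken from the paper.

```python
# Minimal sketch of the classification stage: concatenate static and dynamic
# features per session and train a two-class SVM (owner vs. other user).
# Feature dimensions and hyper-parameters below are assumptions for illustration.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def fuse_features(static_feats: np.ndarray, dynamic_feats: np.ndarray) -> np.ndarray:
    """Concatenate per-sample static and dynamic feature vectors."""
    return np.concatenate([static_feats, dynamic_feats], axis=1)


# Hypothetical data: 200 authentication sessions, each with a 2-D gaze-error
# vector (static) and a 64-D motion descriptor (dynamic); label 1 = device
# owner, 0 = other user.
rng = np.random.default_rng(0)
static_feats = rng.normal(size=(200, 2))
dynamic_feats = rng.normal(size=(200, 64))
labels = rng.integers(0, 2, size=200)

X = fuse_features(static_feats, dynamic_feats)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0, stratify=labels
)

# RBF-kernel SVM with feature standardisation; the abstract does not specify
# a kernel or hyper-parameters, so these are placeholder choices.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```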


Bibliographic Details
Main Authors: Ma, Zhuo; Wang, Xinglong; Ma, Ruijie; Wang, Zhuzhu; Ma, Jianfeng
Format: Online Article Text
Language: English
Published: MDPI, 2018
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6164076/
https://www.ncbi.nlm.nih.gov/pubmed/30200380
http://dx.doi.org/10.3390/s18092894
Journal: Sensors (Basel)
Published Online: 2018-08-31
Rights: © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).