
Eye tracking and eye expression decoding based on transparent, flexible and ultra-persistent electrostatic interface

Bibliographic Details
Main Authors: Shi, Yuxiang; Yang, Peng; Lei, Rui; Liu, Zhaoqi; Dong, Xuanyi; Tao, Xinglin; Chu, Xiangcheng; Wang, Zhong Lin; Chen, Xiangyu
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10247702/
https://www.ncbi.nlm.nih.gov/pubmed/37286541
http://dx.doi.org/10.1038/s41467-023-39068-2
author Shi, Yuxiang
Yang, Peng
Lei, Rui
Liu, Zhaoqi
Dong, Xuanyi
Tao, Xinglin
Chu, Xiangcheng
Wang, Zhong Lin
Chen, Xiangyu
collection PubMed
description Eye tracking provides valuable insight for analyzing visual attention and the underlying thinking process through the observation of eye movements. Here, a transparent, flexible and ultra-persistent electrostatic sensing interface is proposed for realizing an active eye tracking (AET) system based on the electrostatic induction effect. Through a triple-layer structure combining a dielectric bilayer and a rough-surface Ag nanowire (Ag NW) electrode layer, the inherent capacitance and interfacial trapping density of the electrostatic interface have been strongly enhanced, contributing to an unprecedented charge storage capability. The electrostatic charge density of the interface reached 1671.10 μC·m⁻² with a charge-keeping rate of 96.91% after 1000 non-contact operation cycles, ultimately enabling oculogyric detection with an angular resolution of 5°. Thus, the AET system enables real-time decoding of eye movements for customer preference recording and eye-controlled human-computer interaction, supporting its broad potential in commercial applications, virtual reality, human-computer interaction and medical monitoring.
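
As a purely illustrative aside (not part of the record or of the paper), the short Python sketch below reproduces the charge-retention arithmetic quoted in the abstract and shows one hypothetical way a readout could snap a gaze estimate to the stated 5° angular resolution. The four-channel differential signal model, the calibration gain and all function names are assumptions introduced here for illustration, not the authors' published method.

```python
# Illustrative sketch only: reproduces the abstract's charge-retention arithmetic
# and shows a hypothetical quantization of a gaze estimate to the reported 5°
# resolution. The four-electrode layout and signal model are assumptions, not
# the authors' published method.

# Figures quoted in the abstract
INITIAL_CHARGE_DENSITY = 1671.10   # μC·m⁻², electrostatic charge density of the interface
CHARGE_KEEPING_RATE = 0.9691       # fraction retained after 1000 non-contact cycles
ANGULAR_RESOLUTION_DEG = 5.0       # reported oculogyric (eye-rotation) resolution


def retained_charge_density(initial: float = INITIAL_CHARGE_DENSITY,
                            keeping_rate: float = CHARGE_KEEPING_RATE) -> float:
    """Charge density remaining after the endurance test (μC·m⁻²)."""
    return initial * keeping_rate


def quantize_gaze_angle(raw_angle_deg: float,
                        resolution_deg: float = ANGULAR_RESOLUTION_DEG) -> float:
    """Snap a continuous gaze-angle estimate to the sensor's angular resolution."""
    return round(raw_angle_deg / resolution_deg) * resolution_deg


def gaze_angle_from_channels(up: float, down: float,
                             left: float, right: float) -> tuple:
    """Hypothetical mapping from four induced-charge signals (arbitrary units)
    placed around the eye to horizontal/vertical gaze angles, assuming the
    differential signal scales roughly linearly with rotation over a small range."""
    GAIN_DEG_PER_UNIT = 30.0  # assumed calibration gain, not a measured value
    horizontal = GAIN_DEG_PER_UNIT * (right - left) / max(right + left, 1e-9)
    vertical = GAIN_DEG_PER_UNIT * (up - down) / max(up + down, 1e-9)
    return quantize_gaze_angle(horizontal), quantize_gaze_angle(vertical)


if __name__ == "__main__":
    print(f"Retained charge density: {retained_charge_density():.2f} μC·m⁻²")
    # Toy channel readings for a slight rightward, upward glance
    print("Quantized gaze (h, v):", gaze_angle_from_channels(0.55, 0.45, 0.40, 0.60))
```

Running the sketch prints a retained charge density of about 1619.46 μC·m⁻² (1671.10 × 0.9691) and a toy gaze estimate quantized to 5° steps.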
format Online
Article
Text
id pubmed-10247702
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-10247702 2023-06-09. Eye tracking and eye expression decoding based on transparent, flexible and ultra-persistent electrostatic interface. Shi, Yuxiang; Yang, Peng; Lei, Rui; Liu, Zhaoqi; Dong, Xuanyi; Tao, Xinglin; Chu, Xiangcheng; Wang, Zhong Lin; Chen, Xiangyu. Nat Commun, Article. Nature Publishing Group UK, 2023-06-07. /pmc/articles/PMC10247702/ /pubmed/37286541 http://dx.doi.org/10.1038/s41467-023-39068-2. © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source, a link to the license is provided, and any changes are indicated.
title Eye tracking and eye expression decoding based on transparent, flexible and ultra-persistent electrostatic interface
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10247702/
https://www.ncbi.nlm.nih.gov/pubmed/37286541
http://dx.doi.org/10.1038/s41467-023-39068-2