Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning
Developing a user interface (UI) suitable for headset environments is one of the challenges in the field of augmented reality (AR) technologies. This study proposes a hands-free UI for an AR headset that exploits facial gestures of the wearer to recognize user intentions. The facial gestures of the headset wearer are detected by a custom-designed sensor that detects skin deformation based on infrared diffusion characteristics of human skin. We designed a deep neural network classifier to determine the user’s intended gestures from skin-deformation data, which are exploited as user input commands for the proposed UI system. The proposed classifier is composed of a spatiotemporal autoencoder and deep embedded clustering algorithm, trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments were performed on the online sensor data to verify operation of the device. We achieved implementation of a hands-free UI for an AR headset with average accuracy of 95.4% user-command recognition, as determined through tests by participants.
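As a rough illustration of the classifier architecture named in the abstract, here is a minimal PyTorch sketch of a spatiotemporal autoencoder for windowed sensor sequences, pretrained purely on reconstruction error so that no gesture labels are required. The channel count, window length, and layer sizes below are illustrative assumptions, not the paper's actual configuration; a companion sketch of the deep embedded clustering stage follows the staff-view table at the end of this record.

```python
# Minimal sketch of a spatiotemporal autoencoder for skin-deformation
# sensor windows. Assumptions (not from the paper): 4 IR sensor channels,
# 32-sample windows, and the layer sizes chosen below.
import torch
import torch.nn as nn

class SpatioTemporalAE(nn.Module):
    def __init__(self, channels=4, hidden=16, latent=8):
        super().__init__()
        # Conv1d mixes the spatial sensor channels; the LSTMs model the
        # temporal evolution of the skin-deformation signal.
        self.conv = nn.Conv1d(channels, hidden, kernel_size=3, padding=1)
        self.enc_rnn = nn.LSTM(hidden, latent, batch_first=True)
        self.dec_rnn = nn.LSTM(latent, hidden, batch_first=True)
        self.out = nn.Linear(hidden, channels)

    def forward(self, x):                  # x: (batch, time, channels)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        z, _ = self.enc_rnn(h)             # latent sequence embedding
        d, _ = self.dec_rnn(z)
        return self.out(d), z              # reconstruction and embedding

# Unsupervised pretraining step: minimize reconstruction error only.
model = SpatioTemporalAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(16, 32, 4)                # dummy batch of sensor windows
opt.zero_grad()
recon, z = model(x)
loss = nn.functional.mse_loss(recon, x)
loss.backward()
opt.step()
```

The embedding `z` produced by the encoder is what the clustering stage then groups into gesture classes.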
Main Authors: | Cha, Jaekwang; Kim, Jinhyuk; Kim, Shiho |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2019 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6832972/ https://www.ncbi.nlm.nih.gov/pubmed/31614988 http://dx.doi.org/10.3390/s19204441 |
_version_ | 1783466269377298432 |
---|---|
author | Cha, Jaekwang; Kim, Jinhyuk; Kim, Shiho |
author_facet | Cha, Jaekwang; Kim, Jinhyuk; Kim, Shiho |
author_sort | Cha, Jaekwang |
collection | PubMed |
description | Developing a user interface (UI) suitable for headset environments is one of the challenges in the field of augmented reality (AR) technologies. This study proposes a hands-free UI for an AR headset that exploits facial gestures of the wearer to recognize user intentions. The facial gestures of the headset wearer are detected by a custom-designed sensor that detects skin deformation based on infrared diffusion characteristics of human skin. We designed a deep neural network classifier to determine the user’s intended gestures from skin-deformation data, which are exploited as user input commands for the proposed UI system. The proposed classifier is composed of a spatiotemporal autoencoder and deep embedded clustering algorithm, trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments were performed on the online sensor data to verify operation of the device. We achieved implementation of a hands-free UI for an AR headset with average accuracy of 95.4% user-command recognition, as determined through tests by participants. |
format | Online Article Text |
id | pubmed-6832972 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-6832972 2019-11-25 Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning Cha, Jaekwang; Kim, Jinhyuk; Kim, Shiho Sensors (Basel) Article Developing a user interface (UI) suitable for headset environments is one of the challenges in the field of augmented reality (AR) technologies. This study proposes a hands-free UI for an AR headset that exploits facial gestures of the wearer to recognize user intentions. The facial gestures of the headset wearer are detected by a custom-designed sensor that detects skin deformation based on infrared diffusion characteristics of human skin. We designed a deep neural network classifier to determine the user’s intended gestures from skin-deformation data, which are exploited as user input commands for the proposed UI system. The proposed classifier is composed of a spatiotemporal autoencoder and deep embedded clustering algorithm, trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments were performed on the online sensor data to verify operation of the device. We achieved implementation of a hands-free UI for an AR headset with average accuracy of 95.4% user-command recognition, as determined through tests by participants. MDPI 2019-10-14 /pmc/articles/PMC6832972/ /pubmed/31614988 http://dx.doi.org/10.3390/s19204441 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Cha, Jaekwang; Kim, Jinhyuk; Kim, Shiho Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning |
title | Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning |
title_full | Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning |
title_fullStr | Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning |
title_full_unstemmed | Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning |
title_short | Hands-Free User Interface for AR/VR Devices Exploiting Wearer’s Facial Gestures Using Unsupervised Deep Learning |
title_sort | hands-free user interface for ar/vr devices exploiting wearer’s facial gestures using unsupervised deep learning |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6832972/ https://www.ncbi.nlm.nih.gov/pubmed/31614988 http://dx.doi.org/10.3390/s19204441 |
work_keys_str_mv | AT chajaekwang handsfreeuserinterfaceforarvrdevicesexploitingwearersfacialgesturesusingunsuperviseddeeplearning AT kimjinhyuk handsfreeuserinterfaceforarvrdevicesexploitingwearersfacialgesturesusingunsuperviseddeeplearning AT kimshiho handsfreeuserinterfaceforarvrdevicesexploitingwearersfacialgesturesusingunsuperviseddeeplearning |
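The description field above also names a deep embedded clustering (DEC) algorithm as the second stage of the unsupervised classifier. The sketch below shows the standard DEC refinement step (Student's-t soft assignment to centroids, a sharpened target distribution, and a KL-divergence loss); the cluster count and embedding size are illustrative assumptions rather than values taken from the paper.

```python
# Sketch of the deep embedded clustering (DEC) step that refines the
# pretrained embedding. Assumptions: 8-dimensional embeddings and
# 5 gesture-command clusters, chosen only for illustration.
import torch

def soft_assign(z, mu, alpha=1.0):
    # Student's t-kernel similarity between embeddings z (N, d)
    # and cluster centroids mu (K, d), normalized per sample.
    dist2 = torch.cdist(z, mu) ** 2
    q = (1.0 + dist2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # Sharpen the assignments: square high-confidence memberships
    # and normalize by per-cluster frequency.
    w = q ** 2 / q.sum(dim=0)
    return w / w.sum(dim=1, keepdim=True)

z = torch.randn(64, 8)                       # embeddings from the encoder
mu = torch.randn(5, 8, requires_grad=True)   # one centroid per command
q = soft_assign(z, mu)
p = target_distribution(q).detach()
kl = torch.nn.functional.kl_div(q.log(), p, reduction="batchmean")
kl.backward()  # gradients refine the centroids (and, in full DEC, the encoder)
```

Once training converges, each cluster can be mapped to one user command; the 95.4% average recognition accuracy reported in the abstract refers to that command-level classification.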