Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing
The typical configuration of virtual reality (VR) devices consists of a head-mounted display (HMD) and handheld controllers. As such, these units have limited utility in tasks that require hands-free operation, such as surgical operations or assembly work in cyberspace. We propose a user interface for a VR headset based on a wearer’s facial gestures for hands-free interaction, similar to a touch interface.
Main Authors: Kim, Jinhyuk; Cha, Jaekwang; Kim, Shiho
Format: Online Article Text
Language: English
Published: MDPI, 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7766087/ https://www.ncbi.nlm.nih.gov/pubmed/33339247 http://dx.doi.org/10.3390/s20247206
_version_ | 1783628635275526144 |
author | Kim, Jinhyuk; Cha, Jaekwang; Kim, Shiho
author_facet | Kim, Jinhyuk; Cha, Jaekwang; Kim, Shiho
author_sort | Kim, Jinhyuk |
collection | PubMed |
description | The typical configuration of virtual reality (VR) devices consists of a head-mounted display (HMD) and handheld controllers. As such, these units have limited utility in tasks that require hands-free operation, such as surgical operations or assembly work in cyberspace. We propose a user interface for a VR headset based on a wearer’s facial gestures for hands-free interaction, similar to a touch interface. By sensing and recognizing the expressions associated with the in situ intentional movements of a user’s facial muscles, we define a set of commands that combine predefined facial gestures with head movements. This is achieved by utilizing six pairs of infrared (IR) photocouplers positioned at the foam interface of an HMD. We demonstrate the usability and report on the user experience, as well as the performance of the proposed command set, using an experimental VR game without any additional controllers. We obtained more than 99% recognition accuracy for each facial gesture throughout the three steps of experimental tests. The proposed input interface is a cost-effective and efficient solution that facilitates hands-free operation of a VR headset using built-in infrared photocouplers positioned in the foam interface. The proposed system recognizes facial gestures and adds a hands-free user interface to the HMD, similar to the touch-screen experience of a smartphone. |
format | Online Article Text |
id | pubmed-7766087 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7766087 2020-12-28. Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing. Kim, Jinhyuk; Cha, Jaekwang; Kim, Shiho. Sensors (Basel), Article. MDPI 2020-12-16. /pmc/articles/PMC7766087/ /pubmed/33339247 http://dx.doi.org/10.3390/s20247206. Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Kim, Jinhyuk Cha, Jaekwang Kim, Shiho Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing |
title | Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing |
title_full | Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing |
title_fullStr | Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing |
title_full_unstemmed | Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing |
title_short | Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing |
title_sort | hands-free user interface for vr headsets based on in situ facial gesture sensing |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7766087/ https://www.ncbi.nlm.nih.gov/pubmed/33339247 http://dx.doi.org/10.3390/s20247206 |
work_keys_str_mv | AT kimjinhyuk handsfreeuserinterfaceforvrheadsetsbasedoninsitufacialgesturesensing AT chajaekwang handsfreeuserinterfaceforvrheadsetsbasedoninsitufacialgesturesensing AT kimshiho handsfreeuserinterfaceforvrheadsetsbasedoninsitufacialgesturesensing |
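
The description above mentions sensing intentional facial-muscle movements with six pairs of IR photocouplers mounted in the HMD foam interface and mapping the resulting patterns to commands. As a purely illustrative sketch, and not the authors' published implementation, the Python snippet below shows one way a simple threshold-based classifier over six sensor channels could be structured; the sensor indices, threshold value, and gesture labels are assumptions made for illustration only.

```python
# Illustrative sketch only: threshold-based facial-gesture detection from six
# IR photocoupler readings, inspired by the abstract above. Sensor layout,
# threshold, and gesture names are hypothetical, not the paper's method.
from statistics import mean

NUM_SENSORS = 6   # six IR photocoupler pairs in the HMD foam interface
THRESHOLD = 0.15  # assumed normalized deviation that counts as "active"

# Hypothetical mapping from active-sensor patterns to gesture labels.
GESTURE_PATTERNS = {
    frozenset({0, 1}): "smile",
    frozenset({2, 3}): "frown",
    frozenset({4, 5}): "cheek_puff",
}

def calibrate(samples):
    """Average several resting-face frames to get a per-sensor baseline."""
    return [mean(frame[i] for frame in samples) for i in range(NUM_SENSORS)]

def classify(frame, baseline, threshold=THRESHOLD):
    """Return a gesture label if the deviation pattern matches a known gesture."""
    active = frozenset(
        i for i in range(NUM_SENSORS) if abs(frame[i] - baseline[i]) > threshold
    )
    return GESTURE_PATTERNS.get(active)  # None if no pattern matches

if __name__ == "__main__":
    # Fake resting-face frames, then a frame where sensors 0 and 1 deviate.
    rest = [[0.50] * NUM_SENSORS for _ in range(10)]
    baseline = calibrate(rest)
    gesture_frame = [0.72, 0.70, 0.51, 0.49, 0.50, 0.50]
    print(classify(gesture_frame, baseline))  # -> "smile"
```

A real system would likely smooth the readings over time and combine recognized gestures with head-movement data to form the command set described in the abstract.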