A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera

Tracking detailed hand motion is a fundamental research topic in the area of human-computer interaction (HCI) and has been widely studied for decades. Existing solutions with single-model inputs either require tedious calibration, are expensive, or lack sufficient robustness and accuracy due to occlusions. In this study, we present a real-time system that reconstructs the exact hand motion by iteratively fitting a triangular mesh model to the absolute measurements of the hand from a depth camera under the robust restriction of a simple data glove. We redefine and simplify the function of the data glove to alleviate its limitations, i.e., tedious calibration, cumbersome equipment, and hampered movement, and to keep our system lightweight. For accurate hand tracking, we introduce a new set of degrees of freedom (DoFs), a shape adjustment term for personalizing the triangular mesh model, and an adaptive collision term to prevent self-intersection. For efficiency, we extract a strong pose-space prior from the data glove to narrow the pose search space. We also present a simplified approach for computing tracking correspondences that reduces computation cost without loss of accuracy. Quantitative experiments show that our system achieves accuracy comparable to or better than the state of the art, with about a 40% improvement in robustness. Moreover, our system runs independently of the Graphics Processing Unit (GPU) and reaches 40 frames per second (FPS) at about 25% Central Processing Unit (CPU) usage.

Bibliographic Details
Main Authors: Jiang, Linjun, Xia, Hailun, Guo, Caili
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6865016/
https://www.ncbi.nlm.nih.gov/pubmed/31661877
http://dx.doi.org/10.3390/s19214680
_version_ 1783472011220090880
author Jiang, Linjun
Xia, Hailun
Guo, Caili
author_facet Jiang, Linjun
Xia, Hailun
Guo, Caili
author_sort Jiang, Linjun
collection PubMed
description Tracking detailed hand motion is a fundamental research topic in the area of human-computer interaction (HCI) and has been widely studied for decades. Existing solutions with single-model inputs either require tedious calibration, are expensive, or lack sufficient robustness and accuracy due to occlusions. In this study, we present a real-time system that reconstructs the exact hand motion by iteratively fitting a triangular mesh model to the absolute measurements of the hand from a depth camera under the robust restriction of a simple data glove. We redefine and simplify the function of the data glove to alleviate its limitations, i.e., tedious calibration, cumbersome equipment, and hampered movement, and to keep our system lightweight. For accurate hand tracking, we introduce a new set of degrees of freedom (DoFs), a shape adjustment term for personalizing the triangular mesh model, and an adaptive collision term to prevent self-intersection. For efficiency, we extract a strong pose-space prior from the data glove to narrow the pose search space. We also present a simplified approach for computing tracking correspondences that reduces computation cost without loss of accuracy. Quantitative experiments show that our system achieves accuracy comparable to or better than the state of the art, with about a 40% improvement in robustness. Moreover, our system runs independently of the Graphics Processing Unit (GPU) and reaches 40 frames per second (FPS) at about 25% Central Processing Unit (CPU) usage.
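The description above outlines the core per-frame algorithm: a triangular hand mesh is iteratively fitted to the depth measurements while a pose prior derived from the data glove, a shape adjustment term, and an adaptive collision term constrain the optimization. The record itself contains no equations or code, so the following is only a minimal, hypothetical Python sketch of such a model-fitting loop; every function, weight, and the DoF count are illustrative placeholders, not the authors' implementation.

```python
# Hypothetical sketch of the per-frame fitting described in the abstract:
# minimize a combined energy (depth data term + glove pose prior + collision
# penalty) over the hand's pose DoFs. All terms below are toy stand-ins.
import numpy as np
from scipy.optimize import minimize

N_DOF = 26  # placeholder DoF count; the paper defines its own set of DoFs

def model_points(pose):
    """Placeholder forward kinematics: map pose DoFs to 3D points on the hand model."""
    return np.outer(np.sin(pose), [1.0, 0.5, 0.25])          # (N_DOF, 3)

def data_term(pose, depth_points):
    """Squared distance from each model point to its nearest depth-cloud point."""
    m = model_points(pose)
    d = np.linalg.norm(m[:, None, :] - depth_points[None, :, :], axis=-1)
    return np.sum(d.min(axis=1) ** 2)

def glove_prior_term(pose, glove_pose):
    """Keep the estimate close to the coarse pose reported by the data glove."""
    return np.sum((pose - glove_pose) ** 2)

def collision_term(pose, radius=0.05):
    """Penalize pairs of model points closer than `radius` (self-intersection proxy)."""
    m = model_points(pose)
    d = np.linalg.norm(m[:, None, :] - m[None, :, :], axis=-1)
    d += np.eye(len(m)) * 1e3                                 # ignore self-distances
    return np.sum(np.maximum(radius - d, 0.0) ** 2)

def fit_frame(depth_points, glove_pose, prev_pose,
              w_data=1.0, w_glove=0.5, w_coll=10.0):
    """One tracking frame: minimize the combined energy, warm-started from prev_pose."""
    energy = lambda p: (w_data * data_term(p, depth_points)
                        + w_glove * glove_prior_term(p, glove_pose)
                        + w_coll * collision_term(p))
    return minimize(energy, prev_pose, method="L-BFGS-B").x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    depth = rng.normal(size=(200, 3))            # fake depth point cloud
    glove = rng.normal(scale=0.1, size=N_DOF)    # fake glove pose reading
    pose = fit_frame(depth, glove, prev_pose=np.zeros(N_DOF))
    print("estimated pose DoFs (first 5):", pose[:5])
```

In the actual system the data term would come from the simplified mesh-to-depth correspondence computation the abstract mentions, the glove-derived pose-space prior would narrow the search space, and the shape adjustment term would additionally personalize the mesh; this sketch only mirrors the overall energy-minimization structure.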
format Online
Article
Text
id pubmed-6865016
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-6865016 2019-12-06 A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera Jiang, Linjun Xia, Hailun Guo, Caili Sensors (Basel) Article Tracking detailed hand motion is a fundamental research topic in the area of human-computer interaction (HCI) and has been widely studied for decades. Existing solutions with single-model inputs either require tedious calibration, are expensive, or lack sufficient robustness and accuracy due to occlusions. In this study, we present a real-time system that reconstructs the exact hand motion by iteratively fitting a triangular mesh model to the absolute measurements of the hand from a depth camera under the robust restriction of a simple data glove. We redefine and simplify the function of the data glove to alleviate its limitations, i.e., tedious calibration, cumbersome equipment, and hampered movement, and to keep our system lightweight. For accurate hand tracking, we introduce a new set of degrees of freedom (DoFs), a shape adjustment term for personalizing the triangular mesh model, and an adaptive collision term to prevent self-intersection. For efficiency, we extract a strong pose-space prior from the data glove to narrow the pose search space. We also present a simplified approach for computing tracking correspondences that reduces computation cost without loss of accuracy. Quantitative experiments show that our system achieves accuracy comparable to or better than the state of the art, with about a 40% improvement in robustness. Moreover, our system runs independently of the Graphics Processing Unit (GPU) and reaches 40 frames per second (FPS) at about 25% Central Processing Unit (CPU) usage. MDPI 2019-10-28 /pmc/articles/PMC6865016/ /pubmed/31661877 http://dx.doi.org/10.3390/s19214680 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Jiang, Linjun
Xia, Hailun
Guo, Caili
A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera
title A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera
title_full A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera
title_fullStr A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera
title_full_unstemmed A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera
title_short A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera
title_sort model-based system for real-time articulated hand tracking using a simple data glove and a depth camera
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6865016/
https://www.ncbi.nlm.nih.gov/pubmed/31661877
http://dx.doi.org/10.3390/s19214680
work_keys_str_mv AT jianglinjun amodelbasedsystemforrealtimearticulatedhandtrackingusingasimpledatagloveandadepthcamera
AT xiahailun amodelbasedsystemforrealtimearticulatedhandtrackingusingasimpledatagloveandadepthcamera
AT guocaili amodelbasedsystemforrealtimearticulatedhandtrackingusingasimpledatagloveandadepthcamera
AT jianglinjun modelbasedsystemforrealtimearticulatedhandtrackingusingasimpledatagloveandadepthcamera
AT xiahailun modelbasedsystemforrealtimearticulatedhandtrackingusingasimpledatagloveandadepthcamera
AT guocaili modelbasedsystemforrealtimearticulatedhandtrackingusingasimpledatagloveandadepthcamera