HANDS: an RGB-D dataset of static hand-gestures for human-robot interaction


Bibliographic Details
Main Authors: Nuzzi, Cristina; Pasinetti, Simone; Pagani, Roberto; Coffetti, Gabriele; Sansoni, Giovanna
Format: Online Article Text
Language: English
Published: Elsevier 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7873347/
https://www.ncbi.nlm.nih.gov/pubmed/33604423
http://dx.doi.org/10.1016/j.dib.2021.106791
Description
Summary: The HANDS dataset has been created for human-robot interaction research and is composed of spatially and temporally aligned RGB and Depth frames. It contains 12 static single-hand gestures, performed with both the right hand and the left hand, and 3 static two-hand gestures, for a total of 29 unique classes. Five actors (two females and three males) have been recorded performing the gestures, each with a different background and lighting conditions. For each actor, 150 RGB frames and their corresponding 150 Depth frames have been collected per gesture, for a total of 2400 RGB frames and 2400 Depth frames per actor. Data has been collected using a Kinect v2 camera intrinsically calibrated to spatially align RGB data to Depth data. The temporal alignment has been performed offline using MATLAB, pairing frames with a maximum temporal distance of 66 ms. This dataset has been used in [1] and is freely available at http://dx.doi.org/10.17632/ndrczc35bt.1.
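For readers who want to apply a comparable temporal alignment to their own RGB-D recordings, the sketch below pairs each RGB frame with the nearest Depth frame by timestamp and keeps only pairs within the 66 ms tolerance reported above. It is a minimal Python illustration, not the authors' MATLAB pipeline; the align_frames function, the example timestamp lists, and the millisecond units are assumptions made for the example.

# Minimal sketch (not the authors' MATLAB code) of a timestamp-based alignment:
# each RGB frame is paired with the nearest Depth frame, and pairs farther
# apart than 66 ms are discarded. Timestamps are hypothetical, assumed to be
# in milliseconds and sorted in ascending order.
from bisect import bisect_left

MAX_GAP_MS = 66  # maximum temporal distance reported for the dataset

def align_frames(rgb_ts, depth_ts, max_gap_ms=MAX_GAP_MS):
    """Return (rgb_index, depth_index) pairs whose timestamps differ by at most max_gap_ms."""
    pairs = []
    for i, t in enumerate(rgb_ts):
        j = bisect_left(depth_ts, t)
        # candidate neighbours: the Depth frame just before and just after t
        candidates = [k for k in (j - 1, j) if 0 <= k < len(depth_ts)]
        if not candidates:
            continue
        best = min(candidates, key=lambda k: abs(depth_ts[k] - t))
        if abs(depth_ts[best] - t) <= max_gap_ms:
            pairs.append((i, best))
    return pairs

# Hypothetical usage with millisecond timestamps
print(align_frames([0, 33, 66, 100], [5, 40, 180]))  # [(0, 0), (1, 1), (2, 1), (3, 1)]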