OHO: A Multi-Modal, Multi-Purpose Dataset for Human-Robot Object Hand-Over
In the context of collaborative robotics, handing over hand-held objects to a robot is a safety-critical task. Therefore, a robust distinction between human hands and presented objects in image data is essential to avoid contact with robotic grippers. To be able to develop machine learning methods f...
Main authors: Stephan, Benedict; Köhler, Mona; Müller, Steffen; Zhang, Yan; Gross, Horst-Michael; Notni, Gunther
Format: Online Article Text
Language: English
Published: MDPI, 2023
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10537499/ https://www.ncbi.nlm.nih.gov/pubmed/37765862 http://dx.doi.org/10.3390/s23187807
Similar items
- Point Cloud Hand–Object Segmentation Using Multimodal Imaging with Thermal and Color Data for Safe Robotic Object Handover
  by: Zhang, Yan, et al.
  Published: (2021)
- KEK Oho '87 High-energy Accelerator Seminars
  Published: (1987)
- From Multi-Modal Property Dataset to Robot-Centric Conceptual Knowledge About Household Objects
  by: Thosar, Madhura, et al.
  Published: (2021)
- A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots
  by: Müller, Steffen, et al.
  Published: (2020)
- An Embedded, Multi-Modal Sensor System for Scalable Robotic and Prosthetic Hand Fingers
  by: Weiner, Pascal, et al.
  Published: (2019)