
DeepClaw 2.0: A Data Collection Platform for Learning Human Manipulation

Besides direct interaction, human hands are also skilled at using tools to manipulate objects for typical life and work tasks. This paper proposes DeepClaw 2.0 as a low-cost, open-sourced data collection platform for learning human manipulation. We use an RGB-D camera to visually track the motion and deformation of a pair of soft finger networks on a modified kitchen tong operated by human teachers. These fingers can be easily integrated with robotic grippers to bridge the structural mismatch between humans and robots during learning. The deformation of soft finger networks, which reveals tactile information in contact-rich manipulation, is captured passively. We collected a comprehensive sample dataset involving five human demonstrators in ten manipulation tasks with five trials per task. As part of this low-cost, open-sourced platform, we also developed an intuitive interface that converts the raw sensor data into state-action data for imitation learning problems. For learning-by-demonstration problems, we further demonstrated our dataset’s potential by using real robotic hardware to collect joint actuation data, or a simulated environment when access to the hardware is limited.
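The abstract describes an interface that turns raw sensor streams (RGB-D tracking of the tong and the passive deformation of the soft fingers) into state-action pairs for imitation learning. The following minimal Python sketch illustrates that idea only; the names Frame, tong_pose, finger_deformation, and gripper_width, as well as the finite-difference action definition, are assumptions made for this example and are not the DeepClaw 2.0 API.

# Illustrative sketch: package per-frame tracking output into (state, action)
# pairs for imitation learning. All field names below are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class Frame:
    """One time step of raw tracking output from the RGB-D camera."""
    tong_pose: np.ndarray           # 6-DoF pose of the tong tip (x, y, z, roll, pitch, yaw)
    finger_deformation: np.ndarray  # feature vector describing soft-finger deformation
    gripper_width: float            # opening between the two fingers, in metres


def frames_to_state_action(frames: List[Frame]) -> List[Tuple[np.ndarray, np.ndarray]]:
    """Pair each observed state with the action that leads to the next frame.

    The state concatenates pose, deformation, and finger opening; the action
    is the finite-difference change in pose and opening between consecutive
    frames.
    """
    pairs = []
    for prev, nxt in zip(frames[:-1], frames[1:]):
        state = np.concatenate(
            [prev.tong_pose, prev.finger_deformation, [prev.gripper_width]]
        )
        action = np.concatenate(
            [nxt.tong_pose - prev.tong_pose, [nxt.gripper_width - prev.gripper_width]]
        )
        pairs.append((state, action))
    return pairs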


Bibliographic Details
Main Authors: Wang, Haokun; Liu, Xiaobo; Qiu, Nuofan; Guo, Ning; Wan, Fang; Song, Chaoyang
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022-03-15
Journal: Front Robot AI
Subjects: Robotics and AI
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8964492/
https://www.ncbi.nlm.nih.gov/pubmed/35368430
http://dx.doi.org/10.3389/frobt.2022.787291
License: Copyright © 2022 Wang, Liu, Qiu, Guo, Wan and Song. Open-access article distributed under the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/