Human grasping database for activities of daily living with depth, color and kinematic data streams
This paper presents a grasping database collected from multiple human subjects for activities of daily living in unstructured environments. The main strength of this database is the use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm and upper body kinematic data acquired from an inertial motion capture suit. 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and associated sensor data acquired from the three sensor modalities. We also provide our data annotation software written in Matlab as an open-source tool. The size of the database is 172 GB. We believe this database can be used as a stepping stone to develop big data and machine learning techniques for grasping and manipulation with potential applications in rehabilitation robotics and intelligent automation.
Main Authors: Saudabayev, Artur; Rysbek, Zhanibek; Khassenova, Raykhan; Varol, Huseyin Atakan
Format: Online Article Text
Language: English
Published: Nature Publishing Group, 2018
Subjects: Data Descriptor
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5972673/ https://www.ncbi.nlm.nih.gov/pubmed/29809171 http://dx.doi.org/10.1038/sdata.2018.101
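The description treats each annotated grasp as a link between three sensor streams (color video, depth frames, and motion-capture kinematics) plus one of 35 taxonomy labels. As a minimal sketch of how such records might be organized in code, the Python fragment below defines a hypothetical per-grasp structure and a class-balance check; every field name, file path, and grasp label here is an assumption for illustration, not the database's actual schema (which the paper and its Matlab annotation tool define).

```python
# Hypothetical sketch only: the record above does not document the database's
# file layout, so every field, path, and label below is an assumption.
from collections import Counter
from dataclasses import dataclass

@dataclass
class GraspRecord:
    """One annotated grasp, linking the three sensing modalities."""
    subject_id: int     # which human subject performed the grasp
    grasp_type: str     # one of the 35 hierarchical-taxonomy labels
    start_s: float      # grasp onset, seconds into the recording
    end_s: float        # grasp release
    color_video: str    # head-mounted action-camera segment
    depth_frames: str   # frames from the depth sensor on the dominant arm
    kinematics: str     # upper-body inertial motion-capture export

def grasp_type_histogram(records):
    """Count occurrences of each grasp type, e.g. to check class balance
    before training a classifier on the 3826 annotated grasps."""
    return Counter(r.grasp_type for r in records)

# Toy usage with invented values; real labels and paths come from the database.
records = [
    GraspRecord(1, "medium wrap", 12.4, 14.1, "s01/color/0001.mp4",
                "s01/depth/0001/", "s01/mocap/0001.csv"),
    GraspRecord(1, "lateral pinch", 15.0, 16.2, "s01/color/0002.mp4",
                "s01/depth/0002/", "s01/mocap/0002.csv"),
]
print(grasp_type_histogram(records))
```

Running the toy example prints `Counter({'medium wrap': 1, 'lateral pinch': 1})`; with the full database one would instead iterate over all annotated grasps.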
| Field | Value |
|---|---|
| _version_ | 1783326467330932736 |
| author | Saudabayev, Artur; Rysbek, Zhanibek; Khassenova, Raykhan; Varol, Huseyin Atakan |
| author_facet | Saudabayev, Artur; Rysbek, Zhanibek; Khassenova, Raykhan; Varol, Huseyin Atakan |
| author_sort | Saudabayev, Artur |
| collection | PubMed |
| description | This paper presents a grasping database collected from multiple human subjects for activities of daily living in unstructured environments. The main strength of this database is the use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm and upper body kinematic data acquired from an inertial motion capture suit. 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and associated sensor data acquired from the three sensor modalities. We also provide our data annotation software written in Matlab as an open-source tool. The size of the database is 172 GB. We believe this database can be used as a stepping stone to develop big data and machine learning techniques for grasping and manipulation with potential applications in rehabilitation robotics and intelligent automation. |
| format | Online Article Text |
| id | pubmed-5972673 |
| institution | National Center for Biotechnology Information |
| language | English |
| publishDate | 2018 |
| publisher | Nature Publishing Group |
| record_format | MEDLINE/PubMed |
| spelling | pubmed-5972673 2018-05-30 Human grasping database for activities of daily living with depth, color and kinematic data streams Saudabayev, Artur; Rysbek, Zhanibek; Khassenova, Raykhan; Varol, Huseyin Atakan Sci Data Data Descriptor This paper presents a grasping database collected from multiple human subjects for activities of daily living in unstructured environments. The main strength of this database is the use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm and upper body kinematic data acquired from an inertial motion capture suit. 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and associated sensor data acquired from the three sensor modalities. We also provide our data annotation software written in Matlab as an open-source tool. The size of the database is 172 GB. We believe this database can be used as a stepping stone to develop big data and machine learning techniques for grasping and manipulation with potential applications in rehabilitation robotics and intelligent automation. Nature Publishing Group 2018-05-29 /pmc/articles/PMC5972673/ /pubmed/29809171 http://dx.doi.org/10.1038/sdata.2018.101 Text en Copyright © 2018, The Author(s) http://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ The Creative Commons Public Domain Dedication waiver http://creativecommons.org/publicdomain/zero/1.0/ applies to the metadata files made available in this article. |
| spellingShingle | Data Descriptor; Saudabayev, Artur; Rysbek, Zhanibek; Khassenova, Raykhan; Varol, Huseyin Atakan; Human grasping database for activities of daily living with depth, color and kinematic data streams |
| title | Human grasping database for activities of daily living with depth, color and kinematic data streams |
| title_full | Human grasping database for activities of daily living with depth, color and kinematic data streams |
| title_fullStr | Human grasping database for activities of daily living with depth, color and kinematic data streams |
| title_full_unstemmed | Human grasping database for activities of daily living with depth, color and kinematic data streams |
| title_short | Human grasping database for activities of daily living with depth, color and kinematic data streams |
| title_sort | human grasping database for activities of daily living with depth, color and kinematic data streams |
| topic | Data Descriptor |
| url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5972673/ https://www.ncbi.nlm.nih.gov/pubmed/29809171 http://dx.doi.org/10.1038/sdata.2018.101 |
| work_keys_str_mv | AT saudabayevartur humangraspingdatabaseforactivitiesofdailylivingwithdepthcolorandkinematicdatastreams; AT rysbekzhanibek humangraspingdatabaseforactivitiesofdailylivingwithdepthcolorandkinematicdatastreams; AT khassenovaraykhan humangraspingdatabaseforactivitiesofdailylivingwithdepthcolorandkinematicdatastreams; AT varolhuseyinatakan humangraspingdatabaseforactivitiesofdailylivingwithdepthcolorandkinematicdatastreams |