SFU-store-nav: A multimodal dataset for indoor human navigation
This article describes a dataset collected in a set of experiments that involves human participants and a robot. The set of experiments was conducted in the computing science robotics lab at Simon Fraser University, Burnaby, BC, Canada, and its aim is to gather data containing common gestures, movements, and other behaviours that may indicate humans’ navigational intent relevant for autonomous robot navigation. The experiment simulates a shopping scenario where human participants come in to pick up items from their shopping lists and interact with a Pepper robot that is programmed to help the human participant. We collected visual data and motion capture data from 108 human participants. The visual data contains live recordings of the experiments and the motion capture data contains the position and orientation of the human participants in world coordinates. This dataset could be valuable for researchers in the robotics, machine learning and computer vision community.
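The motion capture portion of the dataset, as described above, records each participant's position and orientation in world coordinates over time. As a minimal sketch of how such records might be parsed, the snippet below assumes a hypothetical CSV layout (timestamp, position x/y/z, orientation quaternion qx/qy/qz/qw); the column names and file format are illustrative assumptions, not the dataset's documented schema.

```python
# Hedged sketch: parsing hypothetical motion-capture rows (timestamp,
# position, orientation quaternion in world coordinates). The column
# layout is an assumption for illustration only.
import csv
import io
import math
from dataclasses import dataclass

@dataclass
class Pose:
    t: float                                     # timestamp (seconds)
    x: float; y: float; z: float                 # world-frame position
    qx: float; qy: float; qz: float; qw: float   # orientation quaternion

def read_poses(text):
    """Parse CSV text (header row assumed) into a list of Pose records."""
    rows = csv.DictReader(io.StringIO(text))
    return [Pose(*(float(r[k]) for k in ("t", "x", "y", "z",
                                         "qx", "qy", "qz", "qw")))
            for r in rows]

def yaw(p):
    """Heading angle (rotation about the vertical z axis) from the quaternion."""
    return math.atan2(2 * (p.qw * p.qz + p.qx * p.qy),
                      1 - 2 * (p.qy * p.qy + p.qz * p.qz))

# Two synthetic rows: identity orientation, then a 90-degree turn about z.
sample = ("t,x,y,z,qx,qy,qz,qw\n"
          "0.0,0.0,0.0,0.0,0,0,0,1\n"
          "0.5,1.0,0.0,0.0,0,0,0.7071068,0.7071068\n")
poses = read_poses(sample)
```

Recovering the heading from the quaternion, as shown in `yaw`, is one common way to turn such pose records into a navigational-intent signal (e.g. which way a participant is facing relative to the robot).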
Main Authors: | Zhang, Zhitian; Rhim, Jimin; TaherAhmadi, Mahdi; Yang, Kefan; Lim, Angelica; Chen, Mo |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Elsevier 2020 |
Subjects: | Data Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7691721/ https://www.ncbi.nlm.nih.gov/pubmed/33294527 http://dx.doi.org/10.1016/j.dib.2020.106539 |
_version_ | 1783614354127585280 |
---|---|
author | Zhang, Zhitian Rhim, Jimin TaherAhmadi, Mahdi Yang, Kefan Lim, Angelica Chen, Mo |
author_facet | Zhang, Zhitian Rhim, Jimin TaherAhmadi, Mahdi Yang, Kefan Lim, Angelica Chen, Mo |
author_sort | Zhang, Zhitian |
collection | PubMed |
description | This article describes a dataset collected in a set of experiments that involves human participants and a robot. The set of experiments was conducted in the computing science robotics lab at Simon Fraser University, Burnaby, BC, Canada, and its aim is to gather data containing common gestures, movements, and other behaviours that may indicate humans’ navigational intent relevant for autonomous robot navigation. The experiment simulates a shopping scenario where human participants come in to pick up items from their shopping lists and interact with a Pepper robot that is programmed to help the human participant. We collected visual data and motion capture data from 108 human participants. The visual data contains live recordings of the experiments and the motion capture data contains the position and orientation of the human participants in world coordinates. This dataset could be valuable for researchers in the robotics, machine learning and computer vision community. |
format | Online Article Text |
id | pubmed-7691721 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Elsevier |
record_format | MEDLINE/PubMed |
spelling | pubmed-76917212020-12-07 SFU-store-nav: A multimodal dataset for indoor human navigation Zhang, Zhitian Rhim, Jimin TaherAhmadi, Mahdi Yang, Kefan Lim, Angelica Chen, Mo Data Brief Data Article This article describes a dataset collected in a set of experiments that involves human participants and a robot. The set of experiments was conducted in the computing science robotics lab at Simon Fraser University, Burnaby, BC, Canada, and its aim is to gather data containing common gestures, movements, and other behaviours that may indicate humans’ navigational intent relevant for autonomous robot navigation. The experiment simulates a shopping scenario where human participants come in to pick up items from their shopping lists and interact with a Pepper robot that is programmed to help the human participant. We collected visual data and motion capture data from 108 human participants. The visual data contains live recordings of the experiments and the motion capture data contains the position and orientation of the human participants in world coordinates. This dataset could be valuable for researchers in the robotics, machine learning and computer vision community. Elsevier 2020-11-18 /pmc/articles/PMC7691721/ /pubmed/33294527 http://dx.doi.org/10.1016/j.dib.2020.106539 Text en © 2020 The Authors http://creativecommons.org/licenses/by/4.0/ This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Data Article Zhang, Zhitian Rhim, Jimin TaherAhmadi, Mahdi Yang, Kefan Lim, Angelica Chen, Mo SFU-store-nav: A multimodal dataset for indoor human navigation |
title | SFU-store-nav: A multimodal dataset for indoor human navigation |
title_full | SFU-store-nav: A multimodal dataset for indoor human navigation |
title_fullStr | SFU-store-nav: A multimodal dataset for indoor human navigation |
title_full_unstemmed | SFU-store-nav: A multimodal dataset for indoor human navigation |
title_short | SFU-store-nav: A multimodal dataset for indoor human navigation |
title_sort | sfu-store-nav: a multimodal dataset for indoor human navigation |
topic | Data Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7691721/ https://www.ncbi.nlm.nih.gov/pubmed/33294527 http://dx.doi.org/10.1016/j.dib.2020.106539 |
work_keys_str_mv | AT zhangzhitian sfustorenavamultimodaldatasetforindoorhumannavigation AT rhimjimin sfustorenavamultimodaldatasetforindoorhumannavigation AT taherahmadimahdi sfustorenavamultimodaldatasetforindoorhumannavigation AT yangkefan sfustorenavamultimodaldatasetforindoorhumannavigation AT limangelica sfustorenavamultimodaldatasetforindoorhumannavigation AT chenmo sfustorenavamultimodaldatasetforindoorhumannavigation |