Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction
In the industry of the future, as well as in healthcare and at home, robots will be a familiar presence. Since they will be working closely with human operators who are not always properly trained for human-machine interaction tasks, robots will need the ability to automatically adapt to changes in the task to be performed, or to cope with variations in how the human partner completes the task.
Main Authors: | Rea, Francesco, Vignolo, Alessia, Sciutti, Alessandra, Noceti, Nicoletta |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2019 |
Subjects: | Robotics and AI |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805633/ https://www.ncbi.nlm.nih.gov/pubmed/33501073 http://dx.doi.org/10.3389/frobt.2019.00058 |
_version_ | 1783636344296177664 |
---|---|
author | Rea, Francesco Vignolo, Alessia Sciutti, Alessandra Noceti, Nicoletta |
author_facet | Rea, Francesco Vignolo, Alessia Sciutti, Alessandra Noceti, Nicoletta |
author_sort | Rea, Francesco |
collection | PubMed |
description | In the industry of the future, as well as in healthcare and at home, robots will be a familiar presence. Since they will be working closely with human operators who are not always properly trained for human-machine interaction tasks, robots will need the ability to automatically adapt to changes in the task to be performed, or to cope with variations in how the human partner completes the task. The goal of this work is to take a further step toward endowing robots with such a capability. To this purpose, we focus on the identification of relevant time instants in an observed action, called dynamic instants, which are informative about the partner's movement timing and mark the instants where an action starts, ends, or changes to another action. These time instants are temporal locations where the motion can be ideally segmented, providing a set of primitives that can be used to build a temporal signature of the action and ultimately support the understanding of its dynamics and coordination in time. We validate our approach in two contexts, considering first a situation in which the human partner can perform multiple different activities, and then moving to settings where an action is already recognized and shows a certain degree of periodicity. In the two contexts we address different challenges. In the first, working in batch on a dataset collecting videos of a variety of cooking activities, we investigate whether the action signature we compute can facilitate the understanding of which type of action is occurring in front of the observer, with tolerance to viewpoint changes. In the second, we evaluate online, on the robot iCub, the capability of the action signature to provide hints for establishing actual temporal coordination during interaction with human participants. In both cases, we show promising results that speak in favor of the potential of our approach. |
format | Online Article Text |
id | pubmed-7805633 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-78056332021-01-25 Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction Rea, Francesco Vignolo, Alessia Sciutti, Alessandra Noceti, Nicoletta Front Robot AI Robotics and AI In the industry of the future, as well as in healthcare and at home, robots will be a familiar presence. Since they will be working closely with human operators who are not always properly trained for human-machine interaction tasks, robots will need the ability to automatically adapt to changes in the task to be performed, or to cope with variations in how the human partner completes the task. The goal of this work is to take a further step toward endowing robots with such a capability. To this purpose, we focus on the identification of relevant time instants in an observed action, called dynamic instants, which are informative about the partner's movement timing and mark the instants where an action starts, ends, or changes to another action. These time instants are temporal locations where the motion can be ideally segmented, providing a set of primitives that can be used to build a temporal signature of the action and ultimately support the understanding of its dynamics and coordination in time. We validate our approach in two contexts, considering first a situation in which the human partner can perform multiple different activities, and then moving to settings where an action is already recognized and shows a certain degree of periodicity. In the two contexts we address different challenges. In the first, working in batch on a dataset collecting videos of a variety of cooking activities, we investigate whether the action signature we compute can facilitate the understanding of which type of action is occurring in front of the observer, with tolerance to viewpoint changes.
In the second, we evaluate online, on the robot iCub, the capability of the action signature to provide hints for establishing actual temporal coordination during interaction with human participants. In both cases, we show promising results that speak in favor of the potential of our approach. Frontiers Media S.A. 2019-07-16 /pmc/articles/PMC7805633/ /pubmed/33501073 http://dx.doi.org/10.3389/frobt.2019.00058 Text en Copyright © 2019 Rea, Vignolo, Sciutti and Noceti. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Robotics and AI Rea, Francesco Vignolo, Alessia Sciutti, Alessandra Noceti, Nicoletta Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction |
title | Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction |
title_full | Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction |
title_fullStr | Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction |
title_full_unstemmed | Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction |
title_short | Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction |
title_sort | human motion understanding for selecting action timing in collaborative human-robot interaction |
topic | Robotics and AI |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805633/ https://www.ncbi.nlm.nih.gov/pubmed/33501073 http://dx.doi.org/10.3389/frobt.2019.00058 |
work_keys_str_mv | AT reafrancesco humanmotionunderstandingforselectingactiontimingincollaborativehumanrobotinteraction AT vignoloalessia humanmotionunderstandingforselectingactiontimingincollaborativehumanrobotinteraction AT sciuttialessandra humanmotionunderstandingforselectingactiontimingincollaborativehumanrobotinteraction AT nocetinicoletta humanmotionunderstandingforselectingactiontimingincollaborativehumanrobotinteraction |