What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements
Movement and embodiment are communicative affordances central to social robotics, but designing embodied movements for robots often requires extensive knowledge of both robotics and movement theory. More accessible methods such as learning from demonstration often rely on physical access to the robot, which is usually limited to research settings. Machine learning (ML) algorithms can complement hand-crafted or learned movements by generating new behaviors, but this requires large and diverse training datasets, which are hard to come by. In this work, we propose an embodied telepresence system for remotely crowdsourcing emotive robot movement samples that can serve as ML training data. Remote users control the robot through the internet using the motion sensors in their smartphones and view the movement either from a first-person or a third-person perspective. We evaluated the system in an online study where users created emotive movements for the robot and rated their experience. We then utilized the user-crafted movements as inputs to a neural network to generate new movements. We found that users strongly preferred the third-person perspective and that the ML-generated movements are largely comparable to the user-crafted movements. This work supports the usability of telepresence robots as a movement crowdsourcing platform.
Main Authors: Suguitan, Michael; Hoffman, Guy
Format: Online Article Text
Language: English
Published: Springer London, 2022
Subjects: Original Paper
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9068224/ https://www.ncbi.nlm.nih.gov/pubmed/35528273 http://dx.doi.org/10.1007/s00779-022-01684-y
| Field | Value |
|---|---|
| _version_ | 1784700181731606528 |
author | Suguitan, Michael Hoffman, Guy |
author_facet | Suguitan, Michael Hoffman, Guy |
author_sort | Suguitan, Michael |
collection | PubMed |
description | Movement and embodiment are communicative affordances central to social robotics, but designing embodied movements for robots often requires extensive knowledge of both robotics and movement theory. More accessible methods such as learning from demonstration often rely on physical access to the robot which is usually limited to research settings. Machine learning (ML) algorithms can complement hand-crafted or learned movements by generating new behaviors, but this requires large and diverse training datasets, which are hard to come by. In this work, we propose an embodied telepresence system for remotely crowdsourcing emotive robot movement samples that can serve as ML training data. Remote users control the robot through the internet using the motion sensors in their smartphones and view the movement either from a first-person or a third-person perspective. We evaluated the system in an online study where users created emotive movements for the robot and rated their experience. We then utilized the user-crafted movements as inputs to a neural network to generate new movements. We found that users strongly preferred the third-person perspective and that the ML-generated movements are largely comparable to the user-crafted movements. This work supports the usability of telepresence robots as a movement crowdsourcing platform. |
format | Online Article Text |
id | pubmed-9068224 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Springer London |
record_format | MEDLINE/PubMed |
spelling | pubmed-9068224 2022-05-04 What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements Suguitan, Michael Hoffman, Guy Pers Ubiquitous Comput Original Paper Movement and embodiment are communicative affordances central to social robotics, but designing embodied movements for robots often requires extensive knowledge of both robotics and movement theory. More accessible methods such as learning from demonstration often rely on physical access to the robot, which is usually limited to research settings. Machine learning (ML) algorithms can complement hand-crafted or learned movements by generating new behaviors, but this requires large and diverse training datasets, which are hard to come by. In this work, we propose an embodied telepresence system for remotely crowdsourcing emotive robot movement samples that can serve as ML training data. Remote users control the robot through the internet using the motion sensors in their smartphones and view the movement either from a first-person or a third-person perspective. We evaluated the system in an online study where users created emotive movements for the robot and rated their experience. We then utilized the user-crafted movements as inputs to a neural network to generate new movements. We found that users strongly preferred the third-person perspective and that the ML-generated movements are largely comparable to the user-crafted movements. This work supports the usability of telepresence robots as a movement crowdsourcing platform. Springer London 2022-05-04 2023 /pmc/articles/PMC9068224/ /pubmed/35528273 http://dx.doi.org/10.1007/s00779-022-01684-y Text en © The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2022. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
spellingShingle | Original Paper Suguitan, Michael Hoffman, Guy What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements |
title | What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements |
title_full | What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements |
title_fullStr | What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements |
title_full_unstemmed | What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements |
title_short | What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements |
title_sort | what is it like to be a bot? variable perspective embodied telepresence for crowdsourcing robot movements |
topic | Original Paper |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9068224/ https://www.ncbi.nlm.nih.gov/pubmed/35528273 http://dx.doi.org/10.1007/s00779-022-01684-y |
work_keys_str_mv | AT suguitanmichael whatisitliketobeabotvariableperspectiveembodiedtelepresenceforcrowdsourcingrobotmovements AT hoffmanguy whatisitliketobeabotvariableperspectiveembodiedtelepresenceforcrowdsourcingrobotmovements |