
Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model

The control of the interaction between the robot and environment, following a predefined geometric surface path with high accuracy, is a fundamental problem for contact-rich tasks such as machining, polishing, or grinding. Flexible path-following control presents numerous applications in emerging in...


Bibliographic Details
Main Authors: Rastegarpanah, Alireza, Hathaway, Jamie, Stolkin, Rustam
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects: Robotics and AI
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8350735/
https://www.ncbi.nlm.nih.gov/pubmed/34381821
http://dx.doi.org/10.3389/frobt.2021.688275
_version_ 1783735834300645376
author Rastegarpanah, Alireza
Hathaway, Jamie
Stolkin, Rustam
author_facet Rastegarpanah, Alireza
Hathaway, Jamie
Stolkin, Rustam
author_sort Rastegarpanah, Alireza
collection PubMed
description The control of the interaction between the robot and environment, following a predefined geometric surface path with high accuracy, is a fundamental problem for contact-rich tasks such as machining, polishing, or grinding. Flexible path-following control presents numerous applications in emerging industry fields such as disassembly and recycling, where the control system must adapt to a range of dissimilar object classes and where the properties of the environment are uncertain. We present an end-to-end framework for trajectory-independent robotic path following for contact-rich tasks in the presence of parametric uncertainties. We formulate a combination of model predictive control with image-based path planning and real-time visual feedback, based on a learned state-space dynamic model. For modeling the dynamics of the robot-environment system during contact, we introduce the application of the differentiable neural computer, a type of memory-augmented neural network (MANN). Although MANNs have so far been unexplored in a control context, we demonstrate a reduction in RMS error of approximately 21.0% compared with an equivalent Long Short-Term Memory (LSTM) architecture. Our framework was validated in simulation, demonstrating the ability to generalize to materials previously unseen in the training dataset.
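The description above outlines the overall architecture: an image-based planner supplies path waypoints, a learned recurrent (memory-augmented) model predicts the robot-environment contact dynamics, and a model predictive controller optimizes actions over that learned model. The snippet below is only a minimal illustrative sketch of that idea, not the authors' implementation: it uses a plain PyTorch LSTM cell as a stand-in for the differentiable neural computer and a simple random-shooting MPC loop; the names (LearnedDynamics, plan), the state/action dimensions, the quadratic path-tracking cost, and the horizon/sample counts are all hypothetical.

# Minimal sketch of sampling-based MPC over a learned recurrent dynamics model.
# Assumptions (not from the paper): an LSTM cell stands in for the memory-augmented
# network; dimensions, cost, horizon, and sample counts are illustrative placeholders.
import torch

STATE_DIM, ACTION_DIM, HIDDEN = 6, 3, 64   # hypothetical state/action/hidden sizes
HORIZON, N_SAMPLES = 10, 256               # planning horizon and candidate rollouts

class LearnedDynamics(torch.nn.Module):
    """Recurrent surrogate for the robot-environment contact dynamics."""
    def __init__(self):
        super().__init__()
        self.cell = torch.nn.LSTMCell(STATE_DIM + ACTION_DIM, HIDDEN)
        self.head = torch.nn.Linear(HIDDEN, STATE_DIM)   # predicts the next state

    def step(self, state, action, hc):
        h, c = self.cell(torch.cat([state, action], dim=-1), hc)
        return self.head(h), (h, c)

def plan(model, state, path_point):
    """Random-shooting MPC: return the first action of the lowest-cost rollout."""
    actions = torch.randn(N_SAMPLES, HORIZON, ACTION_DIM) * 0.1
    s = state.expand(N_SAMPLES, -1)
    hc = (torch.zeros(N_SAMPLES, HIDDEN), torch.zeros(N_SAMPLES, HIDDEN))
    cost = torch.zeros(N_SAMPLES)
    with torch.no_grad():
        for t in range(HORIZON):
            s, hc = model.step(s, actions[:, t], hc)
            # penalize deviation of the predicted tool position from the path waypoint
            cost += ((s[:, :3] - path_point) ** 2).sum(dim=-1)
    return actions[cost.argmin(), 0]

model = LearnedDynamics()                        # would be trained on contact data
state = torch.zeros(1, STATE_DIM)                # current measured state
target = torch.tensor([0.3, 0.1, 0.05])          # next waypoint from an image-based planner
print("first MPC action:", plan(model, state, target))

Replacing the LSTM cell with an external-memory read/write module and the random-shooting planner with the paper's actual optimizer and cost terms would be needed to reflect the reported method; neither is attempted here.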
format Online
Article
Text
id pubmed-8350735
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8350735 2021-08-10 Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model Rastegarpanah, Alireza Hathaway, Jamie Stolkin, Rustam Front Robot AI Robotics and AI The control of the interaction between the robot and environment, following a predefined geometric surface path with high accuracy, is a fundamental problem for contact-rich tasks such as machining, polishing, or grinding. Flexible path-following control presents numerous applications in emerging industry fields such as disassembly and recycling, where the control system must adapt to a range of dissimilar object classes and where the properties of the environment are uncertain. We present an end-to-end framework for trajectory-independent robotic path following for contact-rich tasks in the presence of parametric uncertainties. We formulate a combination of model predictive control with image-based path planning and real-time visual feedback, based on a learned state-space dynamic model. For modeling the dynamics of the robot-environment system during contact, we introduce the application of the differentiable neural computer, a type of memory-augmented neural network (MANN). Although MANNs have so far been unexplored in a control context, we demonstrate a reduction in RMS error of approximately 21.0% compared with an equivalent Long Short-Term Memory (LSTM) architecture. Our framework was validated in simulation, demonstrating the ability to generalize to materials previously unseen in the training dataset. Frontiers Media S.A. 2021-07-26 /pmc/articles/PMC8350735/ /pubmed/34381821 http://dx.doi.org/10.3389/frobt.2021.688275 Text en Copyright © 2021 Rastegarpanah, Hathaway and Stolkin. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Robotics and AI
Rastegarpanah, Alireza
Hathaway, Jamie
Stolkin, Rustam
Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model
title Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model
title_full Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model
title_fullStr Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model
title_full_unstemmed Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model
title_short Vision-Guided MPC for Robotic Path Following Using Learned Memory-Augmented Model
title_sort vision-guided mpc for robotic path following using learned memory-augmented model
topic Robotics and AI
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8350735/
https://www.ncbi.nlm.nih.gov/pubmed/34381821
http://dx.doi.org/10.3389/frobt.2021.688275
work_keys_str_mv AT rastegarpanahalireza visionguidedmpcforroboticpathfollowingusinglearnedmemoryaugmentedmodel
AT hathawayjamie visionguidedmpcforroboticpathfollowingusinglearnedmemoryaugmentedmodel
AT stolkinrustam visionguidedmpcforroboticpathfollowingusinglearnedmemoryaugmentedmodel