A Flexible Multimodal Sole Sensor for Legged Robot Sensing Complex Ground Information during Locomotion

Recent achievements in the fields of computer vision, reinforcement learning, and locomotion control have greatly extended legged robots’ maneuverability in complex natural environments. However, little research focuses on sensing and analyzing the physical properties of the ground, which is crucial to locomotion when robots interact with highly irregular profiles, deformable terrains, and slippery surfaces. This paper presents a biomimetic, flexible, multimodal sole sensor (FMSS) designed for legged robots to identify the robot’s ontological status and ground information, such as reaction force mapping, contact conditions, terrain, and texture, in order to achieve agile maneuvers. The FMSS is flexible and supports a wide load range (20 Pa–800 kPa), integrating a triboelectric sensing coat, an embedded piezoelectric sensor, and a piezoresistive sensor array. To evaluate its effectiveness and adaptability in different environments, the multimodal sensor was mounted on one foot of a quadruped robot and on a human foot, which then traversed different environments in real-world tests. The experimental results demonstrated that the FMSS could effectively recognize terrain, texture, hardness, and contact conditions during locomotion while retaining its sensitivity (0.66 kPa⁻¹), robustness, and compliance. The presented work indicates the FMSS’s potential to extend the feasibility and dexterity of tactile perception for state estimation and complex scenario detection.
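
For illustration only: the abstract describes fusing a triboelectric sensing coat, an embedded piezoelectric sensor, and a piezoresistive array to recognize terrain and contact conditions during locomotion. The minimal Python sketch below shows one plausible way such per-step fusion and classification could look; every function name, array shape, and the choice of a random-forest classifier are hypothetical assumptions for illustration, not the authors’ implementation.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

SENSITIVITY_PER_KPA = 0.66  # reported pressure sensitivity (0.66 kPa^-1)

def pressure_from_resistance(delta_r_over_r0):
    # Convert the piezoresistive array's relative resistance change to pressure (kPa),
    # assuming a simple linear response at the reported sensitivity.
    return np.asarray(delta_r_over_r0) / SENSITIVITY_PER_KPA

def step_features(tribo, piezo, pressure_map):
    # Collapse one stance phase into a small feature vector:
    # triboelectric contact/lift-off transients, piezoelectric impact energy,
    # and the mean per-taxel force from the piezoresistive array.
    return np.concatenate([
        [np.max(tribo), np.min(tribo)],  # contact and lift-off spikes
        [np.sum(np.square(piezo))],      # touchdown impact energy
        np.mean(pressure_map, axis=0),   # average force per taxel over the step
    ])

# Hypothetical usage: X_train holds one feature vector per recorded step,
# y_train the terrain labels (e.g., grass, gravel, tile).
# clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
# terrain = clf.predict(step_features(tribo, piezo, pressure_map).reshape(1, -1))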


Bibliographic Details
Main Authors: Xu, Yingtian; Wang, Ziya; Hao, Wanjun; Zhao, Wenyu; Lin, Waner; Jin, Bingchen; Ding, Ning
Format: Online Article Text
Language: English
Published: MDPI, 2021
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8399010/
https://www.ncbi.nlm.nih.gov/pubmed/34450801
http://dx.doi.org/10.3390/s21165359
collection PubMed
id pubmed-8399010
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Sensors (Basel)
published online 2021-08-09
license © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).