
Robotic Manipulation under Harsh Conditions Using Self‐Healing Silk‐Based Iontronics


Bibliographic Details
Main Authors: Liu, Mengwei; Zhang, Yujia; Zhang, Yanghong; Zhou, Zhitao; Qin, Nan; Tao, Tiger H.
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc., 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8805592/
https://www.ncbi.nlm.nih.gov/pubmed/34738735
http://dx.doi.org/10.1002/advs.202102596
Description
Summary: Progress toward intelligent human–robotic interactions requires monitoring sensors that are mechanically flexible, easy to implement, and capable of recognition under harsh environments. Conventional sensing methods are split between human-side collection and robot-side feedback and are not designed with these criteria in mind. Iontronic polymers, however, offer a general approach that operates properly both on human skin (commonly known as skin electronics or iontronics) and on machine/robotic surfaces. Here, a unique iontronic composite (silk protein/glycerol/Ca(II) ion) and a supporting molecular mechanism are developed that simultaneously achieve high ionic conductivity (an impedance of around 6 kΩ at 50 kHz), self-healing (within minutes), high stretchability (around 1000%), high strain sensitivity and transparency, and universal adhesiveness across a broad working temperature range (−40 to 120 °C). These merits facilitate iontronic sensing and the implementation of damage-resilient robotic manipulation. Combined with a machine learning algorithm and specified data collection methods, the system classifies 1024 types of human and robot hand gestures under challenging scenarios and achieves object recognition with an accuracy of 99.7%.
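
The record does not describe the classification pipeline itself. As a minimal sketch, assuming each gesture corresponds to a binary bend pattern over ten strain channels (2^10 = 1024 classes, e.g., five fingers on each of two hands) and using a generic scikit-learn classifier in place of the paper's unspecified algorithm, gesture classification from noisy channel readings might look like:

# Hypothetical sketch: classifying 1024 hand gestures from 10 iontronic
# strain channels. The channel count, noise model, and classifier are
# assumptions for illustration, not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_CHANNELS = 10               # assumed: one strain sensor per finger
N_GESTURES = 2 ** N_CHANNELS  # 1024 bent/straight combinations

# Synthetic data: each gesture label encodes a binary bend pattern;
# Gaussian noise stands in for real iontronic impedance readings.
labels = rng.integers(0, N_GESTURES, size=20_000)
patterns = ((labels[:, None] >> np.arange(N_CHANNELS)) & 1).astype(float)
signals = patterns + rng.normal(scale=0.15, size=patterns.shape)

X_train, X_test, y_train, y_test = train_test_split(
    signals, labels, test_size=0.2, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")

With nearly separable binary bend patterns this toy task reaches high accuracy, which only illustrates the scale of a 1024-class problem; the paper's actual features, data collection methods, and model are not specified in this record.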