
AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing

Soft robotic modules have potential use for therapeutic and educational purposes. To do so, they need to be safe, soft, smart, and customizable to serve individuals’ different preferences and personalities. A safe modular robotic product made of soft materials, particularly silicon, programmed by artificial intelligence algorithms and developed via additive manufacturing would be promising. This study focuses on the safe tactile interaction between humans and robots by means of soft material characteristics for translating physical communication to auditory. The embedded vibratory sensors used to stimulate touch senses transmitted through soft materials are presented. The soft module was developed and verified successfully to react to three different patterns of human–robot contact, particularly users’ touches, and then communicate the type of contact with sound. The study develops and verifies a model that can classify different tactile gestures via machine learning algorithms for safe human–robot physical interaction. The system accurately recognizes the gestures and shapes of three-dimensional (3D) printed soft modules. The gestures used for the experiment are the three most common, including slapping, squeezing, and tickling. The model builds on the concept of how safe human–robot physical interactions could help with cognitive and behavioral communication. In this context, the ability to measure, classify, and reflect the behavior of soft materials in robotic modules represents a prerequisite for endowing robotic materials in additive manufacturing for safe interaction with humans.
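The abstract describes classifying three touch gestures (slapping, squeezing, tickling) from embedded vibratory sensors using machine learning. The record does not specify the features or classifier used, so the sketch below is purely illustrative: synthetic one-dimensional vibration windows, simple summary-statistic features, and an SVM stand in for the paper's actual pipeline.

```python
# Hypothetical sketch of the tactile-gesture classification described in the
# abstract. Only the three gesture names come from the record; the signal
# shapes, features, and choice of SVM classifier are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
GESTURES = ["slap", "squeeze", "tickle"]

def synth_window(gesture, n=200):
    """Generate a fake 1-D vibration window with a gesture-dependent profile."""
    t = np.linspace(0, 1, n)
    if gesture == "slap":        # short, high-amplitude impulse
        sig = 3.0 * np.exp(-30 * t)
    elif gesture == "squeeze":   # slow, sustained pressure ramp
        sig = 1.5 * np.sin(np.pi * t)
    else:                        # tickle: low-amplitude, high-frequency buzz
        sig = 0.5 * np.sin(40 * np.pi * t)
    return sig + 0.05 * rng.standard_normal(n)

def features(window):
    """Reduce a sensor window to a small stand-in feature vector."""
    return [window.max(), window.std(), np.abs(np.diff(window)).mean()]

# Build a labeled dataset of 100 windows per gesture.
X, y = [], []
for label, g in enumerate(GESTURES):
    for _ in range(100):
        X.append(features(synth_window(g)))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.25, random_state=0)
clf = SVC().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

Because the synthetic classes are well separated, almost any classifier works here; in practice the choice of features over the raw vibration signal matters far more than the model.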

Bibliographic Details
Main Authors: Zolfagharian, Ali, Khosravani, Mohammad Reza, Duong Vu, Hoang, Nguyen, Minh Khoi, Kouzani, Abbas Z., Bodaghi, Mahdi
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9416509/
https://www.ncbi.nlm.nih.gov/pubmed/36015560
http://dx.doi.org/10.3390/polym14163302
_version_ 1784776497859395584
author Zolfagharian, Ali
Khosravani, Mohammad Reza
Duong Vu, Hoang
Nguyen, Minh Khoi
Kouzani, Abbas Z.
Bodaghi, Mahdi
author_facet Zolfagharian, Ali
Khosravani, Mohammad Reza
Duong Vu, Hoang
Nguyen, Minh Khoi
Kouzani, Abbas Z.
Bodaghi, Mahdi
author_sort Zolfagharian, Ali
collection PubMed
description Soft robotic modules have potential use for therapeutic and educational purposes. To do so, they need to be safe, soft, smart, and customizable to serve individuals’ different preferences and personalities. A safe modular robotic product made of soft materials, particularly silicon, programmed by artificial intelligence algorithms and developed via additive manufacturing would be promising. This study focuses on the safe tactile interaction between humans and robots by means of soft material characteristics for translating physical communication to auditory. The embedded vibratory sensors used to stimulate touch senses transmitted through soft materials are presented. The soft module was developed and verified successfully to react to three different patterns of human–robot contact, particularly users’ touches, and then communicate the type of contact with sound. The study develops and verifies a model that can classify different tactile gestures via machine learning algorithms for safe human–robot physical interaction. The system accurately recognizes the gestures and shapes of three-dimensional (3D) printed soft modules. The gestures used for the experiment are the three most common, including slapping, squeezing, and tickling. The model builds on the concept of how safe human–robot physical interactions could help with cognitive and behavioral communication. In this context, the ability to measure, classify, and reflect the behavior of soft materials in robotic modules represents a prerequisite for endowing robotic materials in additive manufacturing for safe interaction with humans.
format Online
Article
Text
id pubmed-9416509
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-94165092022-08-27 AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing Zolfagharian, Ali Khosravani, Mohammad Reza Duong Vu, Hoang Nguyen, Minh Khoi Kouzani, Abbas Z. Bodaghi, Mahdi Polymers (Basel) Article Soft robotic modules have potential use for therapeutic and educational purposes. To do so, they need to be safe, soft, smart, and customizable to serve individuals’ different preferences and personalities. A safe modular robotic product made of soft materials, particularly silicon, programmed by artificial intelligence algorithms and developed via additive manufacturing would be promising. This study focuses on the safe tactile interaction between humans and robots by means of soft material characteristics for translating physical communication to auditory. The embedded vibratory sensors used to stimulate touch senses transmitted through soft materials are presented. The soft module was developed and verified successfully to react to three different patterns of human–robot contact, particularly users’ touches, and then communicate the type of contact with sound. The study develops and verifies a model that can classify different tactile gestures via machine learning algorithms for safe human–robot physical interaction. The system accurately recognizes the gestures and shapes of three-dimensional (3D) printed soft modules. The gestures used for the experiment are the three most common, including slapping, squeezing, and tickling. The model builds on the concept of how safe human–robot physical interactions could help with cognitive and behavioral communication. In this context, the ability to measure, classify, and reflect the behavior of soft materials in robotic modules represents a prerequisite for endowing robotic materials in additive manufacturing for safe interaction with humans. MDPI 2022-08-13 /pmc/articles/PMC9416509/ /pubmed/36015560 http://dx.doi.org/10.3390/polym14163302 Text en © 2022 by the authors. 
https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Zolfagharian, Ali
Khosravani, Mohammad Reza
Duong Vu, Hoang
Nguyen, Minh Khoi
Kouzani, Abbas Z.
Bodaghi, Mahdi
AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing
title AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing
title_full AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing
title_fullStr AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing
title_full_unstemmed AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing
title_short AI-Based Soft Module for Safe Human–Robot Interaction towards 4D Printing
title_sort ai-based soft module for safe human–robot interaction towards 4d printing
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9416509/
https://www.ncbi.nlm.nih.gov/pubmed/36015560
http://dx.doi.org/10.3390/polym14163302
work_keys_str_mv AT zolfagharianali aibasedsoftmoduleforsafehumanrobotinteractiontowards4dprinting
AT khosravanimohammadreza aibasedsoftmoduleforsafehumanrobotinteractiontowards4dprinting
AT duongvuhoang aibasedsoftmoduleforsafehumanrobotinteractiontowards4dprinting
AT nguyenminhkhoi aibasedsoftmoduleforsafehumanrobotinteractiontowards4dprinting
AT kouzaniabbasz aibasedsoftmoduleforsafehumanrobotinteractiontowards4dprinting
AT bodaghimahdi aibasedsoftmoduleforsafehumanrobotinteractiontowards4dprinting