Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition

Bibliographic Details
Main Authors: Mo, Dong-Han, Tien, Chuen-Lin, Yeh, Yu-Ling, Guo, Yi-Ru, Lin, Chern-Sheng, Chen, Chih-Chin, Chang, Che-Ming
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10098945/
https://www.ncbi.nlm.nih.gov/pubmed/37050567
http://dx.doi.org/10.3390/s23073509
_version_ 1785024937024552960
author Mo, Dong-Han
Tien, Chuen-Lin
Yeh, Yu-Ling
Guo, Yi-Ru
Lin, Chern-Sheng
Chen, Chih-Chin
Chang, Che-Ming
author_facet Mo, Dong-Han
Tien, Chuen-Lin
Yeh, Yu-Ling
Guo, Yi-Ru
Lin, Chern-Sheng
Chen, Chih-Chin
Chang, Che-Ming
author_sort Mo, Dong-Han
collection PubMed
description In this study, the design of a digital-twin human-machine interface sensor (DT-HMIS) is proposed. This is a digital-twin sensor (DT-Sensor) that can meet the demands of human-machine automation collaboration in Industry 5.0. The DT-HMIS allows users/patients to add, modify, delete, query, and restore their previously memorized DT finger-gesture mapping model and programmable logic controller (PLC) logic program, enabling operation of or access to the programmable controller's input-output (I/O) interface and giving users/patients an extended-limb collaboration capability. The system has two main functions. The first is gesture-encoded virtual manipulation, which indirectly accesses the PLC through the DT mapping model and executes logic control program instructions to control electronic peripherals, acting as an extension of the user's limbs. The second is gesture-based virtual manipulation, which helps non-verbal individuals compose spoken sentences through gesture commands to improve their ability to express themselves. The design method uses primitive image processing and an eight-way dual-bit signal processing algorithm to capture the movement of human finger gestures and convert it into digital signals. The system service maps these digital signals from the DT-HMIS to control instructions and drives motion control through mechatronics integration or speech-synthesis feedback, so that users can carry out tasks that would otherwise be inconvenient or require complex handheld physical tools. Because the human-machine interface sensor is based on DT computer vision, it reflects the user's command status without additional wearable devices and promotes interaction with the virtual world. When used by patients, the system maps the user's virtual control onto physical device control, providing the convenience of independent operation while reducing caregiver fatigue. This study shows that the recognition accuracy can reach 99%, demonstrating practicality and application prospects. In future applications, users/patients can interact virtually with other peripheral devices through the DT-HMIS to meet their own interaction needs and promote industry progress. (An illustrative sketch of the gesture-to-command mapping described here follows at the end of this record.)
format Online
Article
Text
id pubmed-10098945
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10098945 2023-04-14 Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition. Mo, Dong-Han; Tien, Chuen-Lin; Yeh, Yu-Ling; Guo, Yi-Ru; Lin, Chern-Sheng; Chen, Chih-Chin; Chang, Che-Ming. Sensors (Basel), Article. MDPI 2023-03-27 /pmc/articles/PMC10098945/ /pubmed/37050567 http://dx.doi.org/10.3390/s23073509 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Mo, Dong-Han
Tien, Chuen-Lin
Yeh, Yu-Ling
Guo, Yi-Ru
Lin, Chern-Sheng
Chen, Chih-Chin
Chang, Che-Ming
Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition
title Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition
title_full Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition
title_fullStr Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition
title_full_unstemmed Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition
title_short Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition
title_sort design of digital-twin human-machine interface sensor with intelligent finger gesture recognition
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10098945/
https://www.ncbi.nlm.nih.gov/pubmed/37050567
http://dx.doi.org/10.3390/s23073509
work_keys_str_mv AT modonghan designofdigitaltwinhumanmachineinterfacesensorwithintelligentfingergesturerecognition
AT tienchuenlin designofdigitaltwinhumanmachineinterfacesensorwithintelligentfingergesturerecognition
AT yehyuling designofdigitaltwinhumanmachineinterfacesensorwithintelligentfingergesturerecognition
AT guoyiru designofdigitaltwinhumanmachineinterfacesensorwithintelligentfingergesturerecognition
AT linchernsheng designofdigitaltwinhumanmachineinterfacesensorwithintelligentfingergesturerecognition
AT chenchihchin designofdigitaltwinhumanmachineinterfacesensorwithintelligentfingergesturerecognition
AT changcheming designofdigitaltwinhumanmachineinterfacesensorwithintelligentfingergesturerecognition
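
The abstract describes a pipeline in which finger-gesture movements are captured by image processing, encoded as eight-way dual-bit digital signals, and translated through a user-editable DT mapping model into either PLC I/O commands or synthesized speech. The paper's implementation is not reproduced in this record, so the following Python sketch is only a rough illustration of that idea: the names (pack_eight_way_dual_bit, DTGestureMapper, dispatch), the two-bits-per-direction packing, and the stubbed PLC/speech outputs are all assumptions, not the authors' code.

# Illustrative sketch only; all names and the exact signal packing are hypothetical.
from typing import Dict, Optional, Sequence, Tuple

def pack_eight_way_dual_bit(channels: Sequence[int]) -> int:
    """Pack eight 2-bit directional readings (each 0-3) into a single 16-bit gesture code."""
    if len(channels) != 8 or any(not 0 <= c <= 3 for c in channels):
        raise ValueError("expected eight channel values in the range 0-3")
    code = 0
    for i, c in enumerate(channels):
        code |= c << (2 * i)  # two bits per directional channel
    return code

class DTGestureMapper:
    """Hypothetical DT mapping model: gesture code -> (action, payload), editable by the user."""
    def __init__(self) -> None:
        self._table: Dict[int, Tuple[str, str]] = {}

    def set_mapping(self, code: int, action: str, payload: str) -> None:
        # covers both "add" and "modify" in the abstract's terminology
        self._table[code] = (action, payload)

    def delete_mapping(self, code: int) -> None:
        self._table.pop(code, None)

    def query(self, code: int) -> Optional[Tuple[str, str]]:
        return self._table.get(code)

def dispatch(mapper: DTGestureMapper, channels: Sequence[int]) -> str:
    """Translate one gesture reading into a PLC command or a spoken sentence (both stubbed)."""
    entry = mapper.query(pack_eight_way_dual_bit(channels))
    if entry is None:
        return "no mapping"
    action, payload = entry
    if action == "plc":
        return f"PLC I/O write: {payload}"     # stand-in for a real PLC interface call
    if action == "speech":
        return f"speech synthesis: {payload}"  # stand-in for a real text-to-speech call
    return "unknown action"

if __name__ == "__main__":
    mapper = DTGestureMapper()
    mapper.set_mapping(pack_eight_way_dual_bit([1, 0, 0, 0, 0, 0, 0, 0]), "plc", "Y0=ON")
    mapper.set_mapping(pack_eight_way_dual_bit([0, 2, 0, 0, 0, 0, 0, 0]), "speech", "I need water")
    print(dispatch(mapper, [1, 0, 0, 0, 0, 0, 0, 0]))  # PLC I/O write: Y0=ON
    print(dispatch(mapper, [0, 2, 0, 0, 0, 0, 0, 0]))  # speech synthesis: I need water

In the actual system, the gesture code would come from the camera-based image-processing stage and the PLC/speech branches would call real device interfaces; the dictionary here only stands in for the restorable mapping model the abstract describes.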