A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition

Robot-assisted minimally invasive surgery (RAMIS) has gained significant traction in clinical practice in recent years. However, most surgical robots rely on touch-based human-robot interaction (HRI), which increases the risk of bacterial diffusion. This risk is particularly concerning when surgeons...

Full description

Bibliographic Details
Main Authors: Wang, Jie, Zhang, Xinkang, Chen, Xinrong, Song, Zhijian
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10277510/
https://www.ncbi.nlm.nih.gov/pubmed/37342464
http://dx.doi.org/10.3389/fnins.2023.1200576
_version_ 1785060297914974208
author Wang, Jie
Zhang, Xinkang
Chen, Xinrong
Song, Zhijian
author_facet Wang, Jie
Zhang, Xinkang
Chen, Xinrong
Song, Zhijian
author_sort Wang, Jie
collection PubMed
description Robot-assisted minimally invasive surgery (RAMIS) has gained significant traction in clinical practice in recent years. However, most surgical robots rely on touch-based human-robot interaction (HRI), which increases the risk of bacterial diffusion. This risk is particularly concerning when surgeons must operate various equipment with their bare hands, necessitating repeated sterilization. Thus, achieving touch-free and precise manipulation with a surgical robot is challenging. To address this challenge, we propose a novel HRI interface based on gesture recognition, leveraging hand-keypoint regression and hand-shape reconstruction methods. By encoding the 21 keypoints from the recognized hand gesture, the robot can successfully perform the corresponding action according to predefined rules, which enables the robot to perform fine-tuning of surgical instruments without the need for physical contact with the surgeon. We evaluated the surgical applicability of the proposed system through both phantom and cadaver studies. In the phantom experiment, the average needle tip location error was 0.51 mm, and the mean angle error was 0.34 degrees. In the simulated nasopharyngeal carcinoma biopsy experiment, the needle insertion error was 0.16 mm, and the angle error was 0.10 degrees. These results indicate that the proposed system achieves clinically acceptable accuracy and can assist surgeons in performing contactless surgery with hand gesture interaction.
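The mechanism the abstract describes (encoding 21 hand keypoints and matching the code against predefined rules to trigger a robot action) can be illustrated with a minimal sketch. This is not the authors' implementation: the keypoint indices follow the common 21-landmark hand convention (wrist = 0; fingertips at 4, 8, 12, 16, 20, as produced by detectors such as MediaPipe Hands), and the finger-extension test and command names are hypothetical placeholders.

# Minimal sketch of keypoint-based gesture-to-command mapping.
# NOT the paper's implementation: keypoint indices assume the common
# 21-landmark hand convention, and the command names are hypothetical.
import numpy as np

WRIST = 0
FINGER_TIPS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky
FINGER_PIPS = [3, 6, 10, 14, 18]   # the joint below each fingertip

def encode_gesture(keypoints: np.ndarray) -> tuple:
    """Encode 21 (x, y, z) hand keypoints as a 5-bit 'fingers extended' code.

    A finger counts as extended when its tip lies farther from the wrist
    than its PIP joint -- a crude but rotation-tolerant test.
    """
    wrist = keypoints[WRIST]
    code = []
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        tip_dist = np.linalg.norm(keypoints[tip] - wrist)
        pip_dist = np.linalg.norm(keypoints[pip] - wrist)
        code.append(int(tip_dist > pip_dist))
    return tuple(code)

# Predefined rules: gesture code -> robot action (placeholder names).
RULES = {
    (0, 1, 0, 0, 0): "ADVANCE_NEEDLE_0.1MM",   # index finger only
    (0, 1, 1, 0, 0): "RETRACT_NEEDLE_0.1MM",   # index + middle
    (1, 1, 1, 1, 1): "HOLD_POSITION",          # open palm
    (0, 0, 0, 0, 0): "STOP",                   # closed fist
}

def gesture_to_command(keypoints: np.ndarray) -> str:
    """Map recognized keypoints to a robot command via the rule table."""
    return RULES.get(encode_gesture(keypoints), "NO_OP")

if __name__ == "__main__":
    # Random keypoints stand in for a real hand-keypoint detector's output.
    rng = np.random.default_rng(0)
    kp = rng.random((21, 3))
    print(gesture_to_command(kp))

In the system described, the keypoints would come from the hand-keypoint regression network rather than a stub, and each rule would trigger a fine-tuning motion of the surgical instrument, keeping the interaction entirely touch-free.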
format Online
Article
Text
id pubmed-10277510
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-10277510 2023-06-20 A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition Wang, Jie Zhang, Xinkang Chen, Xinrong Song, Zhijian Front Neurosci Neuroscience Robot-assisted minimally invasive surgery (RAMIS) has gained significant traction in clinical practice in recent years. However, most surgical robots rely on touch-based human-robot interaction (HRI), which increases the risk of bacterial diffusion. This risk is particularly concerning when surgeons must operate various equipment with their bare hands, necessitating repeated sterilization. Thus, achieving touch-free and precise manipulation with a surgical robot is challenging. To address this challenge, we propose a novel HRI interface based on gesture recognition, leveraging hand-keypoint regression and hand-shape reconstruction methods. By encoding the 21 keypoints from the recognized hand gesture, the robot can successfully perform the corresponding action according to predefined rules, which enables the robot to perform fine-tuning of surgical instruments without the need for physical contact with the surgeon. We evaluated the surgical applicability of the proposed system through both phantom and cadaver studies. In the phantom experiment, the average needle tip location error was 0.51 mm, and the mean angle error was 0.34 degrees. In the simulated nasopharyngeal carcinoma biopsy experiment, the needle insertion error was 0.16 mm, and the angle error was 0.10 degrees. These results indicate that the proposed system achieves clinically acceptable accuracy and can assist surgeons in performing contactless surgery with hand gesture interaction. Frontiers Media S.A. 2023-06-05 /pmc/articles/PMC10277510/ /pubmed/37342464 http://dx.doi.org/10.3389/fnins.2023.1200576 Text en Copyright © 2023 Wang, Zhang, Chen and Song. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Wang, Jie
Zhang, Xinkang
Chen, Xinrong
Song, Zhijian
A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition
title A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition
title_full A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition
title_fullStr A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition
title_full_unstemmed A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition
title_short A touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition
title_sort touch-free human-robot collaborative surgical navigation robotic system based on hand gesture recognition
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10277510/
https://www.ncbi.nlm.nih.gov/pubmed/37342464
http://dx.doi.org/10.3389/fnins.2023.1200576
work_keys_str_mv AT wangjie atouchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition
AT zhangxinkang atouchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition
AT chenxinrong atouchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition
AT songzhijian atouchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition
AT wangjie touchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition
AT zhangxinkang touchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition
AT chenxinrong touchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition
AT songzhijian touchfreehumanrobotcollaborativesurgicalnavigationroboticsystembasedonhandgesturerecognition