A multimodal user interface for touchless control of robotic ultrasound

Bibliographic Details
Main Authors: Schreiter, Josefine, Mielke, Tonia, Schott, Danny, Thormann, Maximilian, Omari, Jazan, Pech, Maciej, Hansen, Christian
Format: Online Article Text
Language: English
Published: Springer International Publishing 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10363039/
https://www.ncbi.nlm.nih.gov/pubmed/36565368
http://dx.doi.org/10.1007/s11548-022-02810-0
_version_ 1785076553566126080
author Schreiter, Josefine
Mielke, Tonia
Schott, Danny
Thormann, Maximilian
Omari, Jazan
Pech, Maciej
Hansen, Christian
author_facet Schreiter, Josefine
Mielke, Tonia
Schott, Danny
Thormann, Maximilian
Omari, Jazan
Pech, Maciej
Hansen, Christian
author_sort Schreiter, Josefine
collection PubMed
description PURPOSE: Past research has investigated and developed robotic ultrasound systems. In this context, interfaces that allow interaction with the robotic system are of paramount importance. Few researchers have addressed the issue of developing non-tactile interaction approaches, although these could be beneficial for maintaining sterility during medical procedures. Interaction could be supported by multimodality, which has the potential to enable intuitive and natural interaction. To assess the feasibility of multimodal interaction for non-tactile control of a co-located robotic ultrasound system, a novel human–robot interaction concept was developed. METHODS: The medical use case of needle-based interventions under hybrid computed tomography and ultrasound imaging was analyzed by interviewing four radiologists. From the resulting workflow, interaction tasks involving human–robot interaction were derived. Based on this, characteristics of a multimodal, touchless human–robot interface were elaborated, suitable interaction modalities were identified, and a corresponding interface was developed, which was thereafter evaluated in a user study with eight participants. RESULTS: The implemented interface combines voice commands for discrete control with hand gesture control for continuous navigation of the robotic US probe. The interaction concept was evaluated by the users via a quantitative questionnaire, yielding average usability. Qualitative analysis of the interview results revealed user satisfaction with the implemented interaction methods and potential improvements to the system. CONCLUSION: A multimodal, touchless interaction concept for a robotic US system, targeting needle-based procedures in interventional radiology, was developed, incorporating combined voice and hand gesture control. Future steps will include the integration of a solution for the missing haptic feedback and the evaluation of its clinical suitability. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s11548-022-02810-0.
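The abstract pairs the two modalities: spoken keywords handle discrete control (e.g., switching modes), while hand motion drives continuous navigation of the robotic US probe. The following is a minimal Python sketch of how such a combined control loop could look; the class, command phrases, gain, and velocity clamp are illustrative assumptions and do not reproduce the authors' implementation.

# Hypothetical sketch of a combined voice + gesture control loop; names,
# commands, and gains are illustrative assumptions, not the authors' system.
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()        # robot holds its current pose
    NAVIGATE = auto()    # hand motion drives the probe

@dataclass
class HandPose:
    x: float  # palm offset from a neutral point, in metres
    y: float
    z: float

class MultimodalProbeController:
    """Voice commands switch discrete modes; hand motion maps to probe velocity."""

    def __init__(self, gain: float = 0.5, max_speed: float = 0.05):
        self.mode = Mode.IDLE
        self.gain = gain            # scaling from hand offset to velocity
        self.max_speed = max_speed  # safety clamp in m/s

    def on_voice_command(self, command: str) -> None:
        # Discrete control: recognized keywords toggle the robot's behaviour.
        if command == "follow hand":
            self.mode = Mode.NAVIGATE
        elif command in ("stop", "hold position"):
            self.mode = Mode.IDLE

    def probe_velocity(self, hand: HandPose) -> tuple[float, float, float]:
        # Continuous navigation: the palm offset becomes a clamped Cartesian velocity.
        if self.mode is not Mode.NAVIGATE:
            return (0.0, 0.0, 0.0)
        return (self._clamp(self.gain * hand.x),
                self._clamp(self.gain * hand.y),
                self._clamp(self.gain * hand.z))

    def _clamp(self, v: float) -> float:
        return max(-self.max_speed, min(self.max_speed, v))

# Example: a voice command enables navigation, then hand motion yields a velocity.
controller = MultimodalProbeController()
controller.on_voice_command("follow hand")
print(controller.probe_velocity(HandPose(0.02, 0.0, -0.01)))  # (0.01, 0.0, -0.005)

In this sketch a spoken command only toggles the controller's mode, and the probe velocity is computed from the palm offset and clamped, so that losing either modality leaves the robot stationary by default.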
format Online
Article
Text
id pubmed-10363039
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Springer International Publishing
record_format MEDLINE/PubMed
spelling pubmed-10363039 2023-07-24 A multimodal user interface for touchless control of robotic ultrasound Schreiter, Josefine Mielke, Tonia Schott, Danny Thormann, Maximilian Omari, Jazan Pech, Maciej Hansen, Christian Int J Comput Assist Radiol Surg Original Article PURPOSE: Past research has investigated and developed robotic ultrasound systems. In this context, interfaces that allow interaction with the robotic system are of paramount importance. Few researchers have addressed the issue of developing non-tactile interaction approaches, although these could be beneficial for maintaining sterility during medical procedures. Interaction could be supported by multimodality, which has the potential to enable intuitive and natural interaction. To assess the feasibility of multimodal interaction for non-tactile control of a co-located robotic ultrasound system, a novel human–robot interaction concept was developed. METHODS: The medical use case of needle-based interventions under hybrid computed tomography and ultrasound imaging was analyzed by interviewing four radiologists. From the resulting workflow, interaction tasks involving human–robot interaction were derived. Based on this, characteristics of a multimodal, touchless human–robot interface were elaborated, suitable interaction modalities were identified, and a corresponding interface was developed, which was thereafter evaluated in a user study with eight participants. RESULTS: The implemented interface combines voice commands for discrete control with hand gesture control for continuous navigation of the robotic US probe. The interaction concept was evaluated by the users via a quantitative questionnaire, yielding average usability. Qualitative analysis of the interview results revealed user satisfaction with the implemented interaction methods and potential improvements to the system. CONCLUSION: A multimodal, touchless interaction concept for a robotic US system, targeting needle-based procedures in interventional radiology, was developed, incorporating combined voice and hand gesture control. Future steps will include the integration of a solution for the missing haptic feedback and the evaluation of its clinical suitability. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s11548-022-02810-0. Springer International Publishing 2022-12-24 2023 /pmc/articles/PMC10363039/ /pubmed/36565368 http://dx.doi.org/10.1007/s11548-022-02810-0 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Original Article
Schreiter, Josefine
Mielke, Tonia
Schott, Danny
Thormann, Maximilian
Omari, Jazan
Pech, Maciej
Hansen, Christian
A multimodal user interface for touchless control of robotic ultrasound
title A multimodal user interface for touchless control of robotic ultrasound
title_full A multimodal user interface for touchless control of robotic ultrasound
title_fullStr A multimodal user interface for touchless control of robotic ultrasound
title_full_unstemmed A multimodal user interface for touchless control of robotic ultrasound
title_short A multimodal user interface for touchless control of robotic ultrasound
title_sort multimodal user interface for touchless control of robotic ultrasound
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10363039/
https://www.ncbi.nlm.nih.gov/pubmed/36565368
http://dx.doi.org/10.1007/s11548-022-02810-0
work_keys_str_mv AT schreiterjosefine amultimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT mielketonia amultimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT schottdanny amultimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT thormannmaximilian amultimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT omarijazan amultimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT pechmaciej amultimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT hansenchristian amultimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT schreiterjosefine multimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT mielketonia multimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT schottdanny multimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT thormannmaximilian multimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT omarijazan multimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT pechmaciej multimodaluserinterfacefortouchlesscontrolofroboticultrasound
AT hansenchristian multimodaluserinterfacefortouchlesscontrolofroboticultrasound