
Image-based recognition of surgical instruments by means of convolutional neural networks

PURPOSE: This work presents a novel camera-based approach for the visual recognition of surgical instruments. In contrast to the state of the art, the presented approach works without any additional markers. Recognition is the first step toward tracking and tracing instruments wherever they can be seen by camera systems. Recognition takes place at article number level: surgical instruments that share the same article number also share the same functions, and a distinction at this level of detail is sufficient for most clinical applications. METHODS: In this work, an image-based data set with over 6500 images is generated from 156 different surgical instruments; forty-two images were acquired from each instrument. The largest part of the data set is used to train convolutional neural networks (CNNs). The CNN is used as a classifier, where each class corresponds to the article number of a surgical instrument; only one surgical instrument exists per article number in the data set. RESULTS: With a suitable amount of validation and test data, different CNN approaches are evaluated. The results show a recognition accuracy of up to 99.9% on the test data. To achieve these accuracies, an EfficientNet-B7 was used, pre-trained on the ImageNet data set and then fine-tuned on the given data; no weights were frozen during training, and all layers were trained. CONCLUSION: With recognition accuracies of up to 99.9% on a highly meaningful test data set, the recognition of surgical instruments is suitable for many track-and-trace applications in the hospital. However, the system has limitations: a homogeneous background and controlled lighting conditions are required. The detection of multiple instruments in one image in front of various backgrounds is part of future work.
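
As a concrete illustration of the training setup described in the abstract (ImageNet pre-training, fine-tuning with no frozen weights, 156 classes corresponding to article numbers), the following is a minimal Python/PyTorch sketch. The paper does not state which framework, optimizer, learning rate, batch size, or directory layout were used, so all of those details are illustrative assumptions rather than the authors' implementation:

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 156  # one class per article number, as described in the abstract

# Assumed directory layout: data/train/<article_number>/*.jpg (not specified in the paper).
train_tf = transforms.Compose([
    transforms.Resize((600, 600)),  # EfficientNet-B7's nominal input resolution
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/train", transform=train_tf)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=8, shuffle=True)

# EfficientNet-B7 pre-trained on ImageNet; swap the 1000-way head for a 156-way head.
model = models.efficientnet_b7(weights=models.EfficientNet_B7_Weights.IMAGENET1K_V1)
model.classifier[1] = nn.Linear(model.classifier[1].in_features, NUM_CLASSES)

# No weights are frozen: every layer remains trainable during fine-tuning.
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed hyperparameters
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):  # assumed number of epochs
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

Replacing only the final linear layer while leaving every parameter trainable mirrors the abstract's statement that no weights were frozen during fine-tuning.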


Bibliographic Details
Main Authors: Lehr, Jan; Kelterborn, Kathrin; Briese, Clemens; Schlueter, Marian; Kroeger, Ole; Krueger, Joerg
Format: Online Article Text
Language: English
Published: Springer International Publishing 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10589183/
https://www.ncbi.nlm.nih.gov/pubmed/37199826
http://dx.doi.org/10.1007/s11548-023-02885-3
_version_ 1785123734224371712
author Lehr, Jan
Kelterborn, Kathrin
Briese, Clemens
Schlueter, Marian
Kroeger, Ole
Krueger, Joerg
author_facet Lehr, Jan
Kelterborn, Kathrin
Briese, Clemens
Schlueter, Marian
Kroeger, Ole
Krueger, Joerg
author_sort Lehr, Jan
collection PubMed
description PURPOSE: This work presents a novel camera-based approach for the visual recognition of surgical instruments. In contrast to the state of the art, the presented approach works without any additional markers. Recognition is the first step toward tracking and tracing instruments wherever they can be seen by camera systems. Recognition takes place at article number level: surgical instruments that share the same article number also share the same functions, and a distinction at this level of detail is sufficient for most clinical applications. METHODS: In this work, an image-based data set with over 6500 images is generated from 156 different surgical instruments; forty-two images were acquired from each instrument. The largest part of the data set is used to train convolutional neural networks (CNNs). The CNN is used as a classifier, where each class corresponds to the article number of a surgical instrument; only one surgical instrument exists per article number in the data set. RESULTS: With a suitable amount of validation and test data, different CNN approaches are evaluated. The results show a recognition accuracy of up to 99.9% on the test data. To achieve these accuracies, an EfficientNet-B7 was used, pre-trained on the ImageNet data set and then fine-tuned on the given data; no weights were frozen during training, and all layers were trained. CONCLUSION: With recognition accuracies of up to 99.9% on a highly meaningful test data set, the recognition of surgical instruments is suitable for many track-and-trace applications in the hospital. However, the system has limitations: a homogeneous background and controlled lighting conditions are required. The detection of multiple instruments in one image in front of various backgrounds is part of future work.
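
Since the description states that classification happens at article-number level, the following hypothetical inference sketch shows how such a fine-tuned classifier could map one instrument image back to an article number. The checkpoint name, file paths, and class-name mapping are invented for illustration and do not come from the paper:

import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((600, 600)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Rebuild the architecture and load the fine-tuned weights (hypothetical checkpoint file).
model = models.efficientnet_b7(weights=None)
model.classifier[1] = torch.nn.Linear(model.classifier[1].in_features, 156)
model.load_state_dict(torch.load("efficientnet_b7_instruments.pt", map_location="cpu"))
model.eval()

# class_names[i] holds the article number of class i, e.g. saved from
# datasets.ImageFolder("data/train").classes at training time (assumed convention).
class_names = torch.load("class_names.pt")

image = preprocess(Image.open("example_instrument.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    probabilities = torch.softmax(model(image), dim=1)
confidence, index = probabilities.max(dim=1)
print(f"predicted article number: {class_names[index.item()]} "
      f"(confidence {confidence.item():.3f})")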
format Online
Article
Text
id pubmed-10589183
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Springer International Publishing
record_format MEDLINE/PubMed
spelling pubmed-10589183 2023-10-22 Image-based recognition of surgical instruments by means of convolutional neural networks Lehr, Jan Kelterborn, Kathrin Briese, Clemens Schlueter, Marian Kroeger, Ole Krueger, Joerg Int J Comput Assist Radiol Surg Review Article PURPOSE: This work presents a novel camera-based approach for the visual recognition of surgical instruments. In contrast to the state of the art, the presented approach works without any additional markers. Recognition is the first step toward tracking and tracing instruments wherever they can be seen by camera systems. Recognition takes place at article number level: surgical instruments that share the same article number also share the same functions, and a distinction at this level of detail is sufficient for most clinical applications. METHODS: In this work, an image-based data set with over 6500 images is generated from 156 different surgical instruments; forty-two images were acquired from each instrument. The largest part of the data set is used to train convolutional neural networks (CNNs). The CNN is used as a classifier, where each class corresponds to the article number of a surgical instrument; only one surgical instrument exists per article number in the data set. RESULTS: With a suitable amount of validation and test data, different CNN approaches are evaluated. The results show a recognition accuracy of up to 99.9% on the test data. To achieve these accuracies, an EfficientNet-B7 was used, pre-trained on the ImageNet data set and then fine-tuned on the given data; no weights were frozen during training, and all layers were trained. CONCLUSION: With recognition accuracies of up to 99.9% on a highly meaningful test data set, the recognition of surgical instruments is suitable for many track-and-trace applications in the hospital. However, the system has limitations: a homogeneous background and controlled lighting conditions are required. The detection of multiple instruments in one image in front of various backgrounds is part of future work. Springer International Publishing 2023-05-18 2023 /pmc/articles/PMC10589183/ /pubmed/37199826 http://dx.doi.org/10.1007/s11548-023-02885-3 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.
spellingShingle Review Article
Lehr, Jan
Kelterborn, Kathrin
Briese, Clemens
Schlueter, Marian
Kroeger, Ole
Krueger, Joerg
Image-based recognition of surgical instruments by means of convolutional neural networks
title Image-based recognition of surgical instruments by means of convolutional neural networks
title_full Image-based recognition of surgical instruments by means of convolutional neural networks
title_fullStr Image-based recognition of surgical instruments by means of convolutional neural networks
title_full_unstemmed Image-based recognition of surgical instruments by means of convolutional neural networks
title_short Image-based recognition of surgical instruments by means of convolutional neural networks
title_sort image-based recognition of surgical instruments by means of convolutional neural networks
topic Review Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10589183/
https://www.ncbi.nlm.nih.gov/pubmed/37199826
http://dx.doi.org/10.1007/s11548-023-02885-3
work_keys_str_mv AT lehrjan imagebasedrecognitionofsurgicalinstrumentsbymeansofconvolutionalneuralnetworks
AT kelterbornkathrin imagebasedrecognitionofsurgicalinstrumentsbymeansofconvolutionalneuralnetworks
AT brieseclemens imagebasedrecognitionofsurgicalinstrumentsbymeansofconvolutionalneuralnetworks
AT schluetermarian imagebasedrecognitionofsurgicalinstrumentsbymeansofconvolutionalneuralnetworks
AT kroegerole imagebasedrecognitionofsurgicalinstrumentsbymeansofconvolutionalneuralnetworks
AT kruegerjoerg imagebasedrecognitionofsurgicalinstrumentsbymeansofconvolutionalneuralnetworks