Multi-modal wound classification using wound image and location by deep neural network
Wound classification is an essential step of wound diagnosis. An efficient classifier can assist wound specialists in classifying wound types with lower financial and time costs and help them decide on an optimal treatment procedure. This study developed a deep neural network-based multi-modal classifier using wound images and their corresponding locations to categorize them into multiple classes, including diabetic, pressure, surgical, and venous ulcers.
| Main Authors: | Anisuzzaman, D. M.; Patel, Yash; Rostami, Behrouz; Niezgoda, Jeffrey; Gopalakrishnan, Sandeep; Yu, Zeyun |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Nature Publishing Group UK, 2022 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9681740/ https://www.ncbi.nlm.nih.gov/pubmed/36414660 http://dx.doi.org/10.1038/s41598-022-21813-0 |
_version_ | 1784834688736559104 |
author | Anisuzzaman, D. M.; Patel, Yash; Rostami, Behrouz; Niezgoda, Jeffrey; Gopalakrishnan, Sandeep; Yu, Zeyun
author_facet | Anisuzzaman, D. M.; Patel, Yash; Rostami, Behrouz; Niezgoda, Jeffrey; Gopalakrishnan, Sandeep; Yu, Zeyun
author_sort | Anisuzzaman, D. M. |
collection | PubMed |
description | Wound classification is an essential step of wound diagnosis. An efficient classifier can assist wound specialists in classifying wound types with lower financial and time costs and help them decide on an optimal treatment procedure. This study developed a deep neural network-based multi-modal classifier using wound images and their corresponding locations to categorize them into multiple classes, including diabetic, pressure, surgical, and venous ulcers. A body map was also developed to prepare the location data, which can help wound specialists tag wound locations more efficiently. Three datasets containing images and their corresponding location information were designed with the help of wound specialists. The multi-modal network was developed by concatenating the image-based and location-based classifier outputs with other modifications. The maximum accuracy on mixed-class classifications (containing background and normal skin) varies from 82.48 to 100% across experiments. The maximum accuracy on wound-class classifications (containing only diabetic, pressure, surgical, and venous) varies from 72.95 to 97.12% across experiments. The proposed multi-modal network also showed a significant improvement over the results of previous works in the literature.
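The description above outlines the fusion strategy: the image-based and location-based classifier outputs are concatenated before the final classification layer. Below is a minimal PyTorch sketch of that idea; it is not the authors' implementation, and the branch architectures, layer sizes, and the `num_locations` body-map size are illustrative assumptions.

```python
# A minimal sketch (not the paper's code) of a two-branch multi-modal
# classifier: a CNN branch for the wound image and an MLP branch for the
# body-map location, fused by concatenation before classification.
import torch
import torch.nn as nn

class MultiModalWoundNet(nn.Module):
    def __init__(self, num_locations=500, num_classes=4):
        # num_locations is a placeholder for the number of body-map
        # regions; the record does not specify the actual count.
        super().__init__()
        # Image branch: a small CNN stand-in for the image classifier.
        self.image_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 64), nn.ReLU(),
        )
        # Location branch: an MLP over a one-hot body-map location vector.
        self.location_branch = nn.Sequential(
            nn.Linear(num_locations, 64), nn.ReLU(),
        )
        # Fusion: concatenate both branch outputs, then classify into the
        # four wound classes (diabetic, pressure, surgical, venous).
        self.classifier = nn.Linear(64 + 64, num_classes)

    def forward(self, image, location_onehot):
        img_feat = self.image_branch(image)
        loc_feat = self.location_branch(location_onehot)
        fused = torch.cat([img_feat, loc_feat], dim=1)
        return self.classifier(fused)

# Usage: one RGB wound image plus its one-hot body-map location tag.
model = MultiModalWoundNet()
image = torch.randn(1, 3, 224, 224)
location = torch.zeros(1, 500)
location[0, 42] = 1.0  # hypothetical body-map region index
logits = model(image, location)  # shape: (1, 4)
```

Concatenation-based (late) fusion like this lets each modality be trained or pretrained independently before the joint classifier learns how location context disambiguates visually similar wound types.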
format | Online Article Text |
id | pubmed-9681740 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-9681740 2022-11-24 Multi-modal wound classification using wound image and location by deep neural network Anisuzzaman, D. M.; Patel, Yash; Rostami, Behrouz; Niezgoda, Jeffrey; Gopalakrishnan, Sandeep; Yu, Zeyun Sci Rep Article Wound classification is an essential step of wound diagnosis. An efficient classifier can assist wound specialists in classifying wound types with lower financial and time costs and help them decide on an optimal treatment procedure. This study developed a deep neural network-based multi-modal classifier using wound images and their corresponding locations to categorize them into multiple classes, including diabetic, pressure, surgical, and venous ulcers. A body map was also developed to prepare the location data, which can help wound specialists tag wound locations more efficiently. Three datasets containing images and their corresponding location information were designed with the help of wound specialists. The multi-modal network was developed by concatenating the image-based and location-based classifier outputs with other modifications. The maximum accuracy on mixed-class classifications (containing background and normal skin) varies from 82.48 to 100% across experiments. The maximum accuracy on wound-class classifications (containing only diabetic, pressure, surgical, and venous) varies from 72.95 to 97.12% across experiments. The proposed multi-modal network also showed a significant improvement over the results of previous works in the literature. Nature Publishing Group UK 2022-11-21 /pmc/articles/PMC9681740/ /pubmed/36414660 http://dx.doi.org/10.1038/s41598-022-21813-0 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/).
spellingShingle | Article; Anisuzzaman, D. M.; Patel, Yash; Rostami, Behrouz; Niezgoda, Jeffrey; Gopalakrishnan, Sandeep; Yu, Zeyun; Multi-modal wound classification using wound image and location by deep neural network
title | Multi-modal wound classification using wound image and location by deep neural network |
title_full | Multi-modal wound classification using wound image and location by deep neural network |
title_fullStr | Multi-modal wound classification using wound image and location by deep neural network |
title_full_unstemmed | Multi-modal wound classification using wound image and location by deep neural network |
title_short | Multi-modal wound classification using wound image and location by deep neural network |
title_sort | multi-modal wound classification using wound image and location by deep neural network |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9681740/ https://www.ncbi.nlm.nih.gov/pubmed/36414660 http://dx.doi.org/10.1038/s41598-022-21813-0 |
work_keys_str_mv | AT anisuzzamandm multimodalwoundclassificationusingwoundimageandlocationbydeepneuralnetwork AT patelyash multimodalwoundclassificationusingwoundimageandlocationbydeepneuralnetwork AT rostamibehrouz multimodalwoundclassificationusingwoundimageandlocationbydeepneuralnetwork AT niezgodajeffrey multimodalwoundclassificationusingwoundimageandlocationbydeepneuralnetwork AT gopalakrishnansandeep multimodalwoundclassificationusingwoundimageandlocationbydeepneuralnetwork AT yuzeyun multimodalwoundclassificationusingwoundimageandlocationbydeepneuralnetwork |