96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing
Main Authors: Chan, Hannah O; Joshi, Rakesh; Morzycki, Alexander; Pun, Andrew C; Wong, Joshua N; Hong, Collins
Format: Online Article Text
Language: English
Published: Oxford University Press, 2022
Subjects: Correlative XII: Clinical Sciences Wounds & Scars 2
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8945745/ http://dx.doi.org/10.1093/jbcr/irac012.099
_version_ | 1784674026770137088 |
author | Chan, Hannah O Joshi, Rakesh Morzycki, Alexander Pun, Andrew C Wong, Joshua N Hong, Collins |
author_facet | Chan, Hannah O Joshi, Rakesh Morzycki, Alexander Pun, Andrew C Wong, Joshua N Hong, Collins |
author_sort | Chan, Hannah O |
collection | PubMed |
description | INTRODUCTION: The appropriate characterization of burn depth and healing is paramount. Unfortunately, the accuracy with which physicians approximate thermal injury depth is poor. While tools to improve detection accuracy exist, including laser Doppler imaging and laser speckle imaging, these technologies are expensive and limited to specialized burn referral centres. They also do not provide an easy means for quantitative, interval tracking of burn healing. Considering these limitations, the application of artificial intelligence has garnered significant interest. We herein present the use of three novel machine learning and computer vision-based algorithms to track burn wound healing. METHODS: Convolutional neural network (CNN) models were trained on 1800 2D colour burn images to classify them into four burn severities. These CNNs were used to develop saliency algorithms that identify the highest “attention” pixels used to recognize burns. Image-based algorithms that count these CNN attention pixels, count pixels representing red granulation in burns, and measure burn dimensions were also developed. As a proof of concept, we tracked the healing of a localized burn on a 25-year-old female patient. The patient suffered a scald on the dorsum of the foot, resulting in a deep partial-thickness burn. Opting out of surgical intervention, the patient visited the hospital over a 6-week period for treatment with non-adhesive dressings and silver nitrate. High-resolution images of the burn, with and without a fiducial marker, were captured with a smartphone camera every 7 days. Images were taken under institutional lighting and used as algorithmic inputs. RESULTS: Data analyses indicate that the healing of the open-wound area was accurately measured in millimetres (±1.7 mm error) using a fiducial marker (18.3 mm diameter). The open-wound area shrank consistently from week 1 to week 6 (Figure 1a-b). The normalized 2D colour images, in which “red” pixel values were counted (Figure 1a-b), confirm the reduction of red granulation in the wound. The saliency algorithm also measured a percentage reduction in the machine learning model’s total attention pixels over the 6-week period (Figure 1c-d). This suggests that the model became less discerning of the burn wound as it healed, consistent with burn healing, which was also clinically validated. |
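As context for the image-based measurements described in the abstract, the following is a minimal illustrative sketch, not the authors' implementation. It assumes OpenCV and NumPy, hypothetical file names, guessed HSV thresholds and Hough-transform parameters, and uses the 18.3 mm circular fiducial marker mentioned above to convert pixel counts to millimetres.

```python
# Illustrative sketch only (not the authors' code): counts red "granulation"
# pixels and converts the open-wound pixel area to square millimetres using a
# circular fiducial marker of known diameter (18.3 mm, per the abstract).
# HSV thresholds and Hough parameters are assumptions and would need tuning.
import cv2
import numpy as np

FIDUCIAL_DIAMETER_MM = 18.3  # known physical diameter of the marker


def red_granulation_mask(bgr: np.ndarray) -> np.ndarray:
    """Binary mask of red-dominant pixels, a proxy for granulation tissue."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the OpenCV hue axis (0-180), so combine two hue bands.
    low = cv2.inRange(hsv, np.array([0, 70, 50], np.uint8),
                      np.array([10, 255, 255], np.uint8))
    high = cv2.inRange(hsv, np.array([170, 70, 50], np.uint8),
                       np.array([180, 255, 255], np.uint8))
    return cv2.bitwise_or(low, high)


def mm_per_pixel(bgr: np.ndarray) -> float:
    """Estimate image scale by locating the circular fiducial marker."""
    gray = cv2.medianBlur(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=100, param2=40, minRadius=10, maxRadius=300)
    if circles is None:
        raise ValueError("fiducial marker not detected")
    radius_px = float(circles[0, 0, 2])  # strongest detected circle
    return FIDUCIAL_DIAMETER_MM / (2.0 * radius_px)


def wound_metrics(image_path: str) -> dict:
    """Red-pixel count and estimated open-wound area (mm^2) for one weekly photo."""
    bgr = cv2.imread(image_path)
    if bgr is None:
        raise FileNotFoundError(image_path)
    scale = mm_per_pixel(bgr)  # millimetres per pixel
    red_pixels = int(cv2.countNonZero(red_granulation_mask(bgr)))
    return {"red_pixels": red_pixels, "wound_area_mm2": red_pixels * scale ** 2}


# Hypothetical weekly comparison:
# print(wound_metrics("burn_week1.jpg"))
# print(wound_metrics("burn_week6.jpg"))
```

The attention-pixel count described in the abstract would be derived analogously from a saliency map of the trained CNN (for example, by thresholding pixel-level gradients of the predicted class score); it is not reproduced here because it requires the trained model.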
format | Online Article Text |
id | pubmed-8945745 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Oxford University Press |
record_format | MEDLINE/PubMed |
spelling | pubmed-8945745 2022-03-28 96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing Chan, Hannah O Joshi, Rakesh Morzycki, Alexander Pun, Andrew C Wong, Joshua N Hong, Collins J Burn Care Res Correlative XII: Clinical Sciences Wounds & Scars 2 INTRODUCTION: The appropriate characterization of burn depth and healing is paramount. Unfortunately, the accuracy with which physicians approximate thermal injury depth is poor. While tools to improve detection accuracy exist, including laser Doppler imaging and laser speckle imaging, these technologies are expensive and limited to specialized burn referral centres. They also do not provide an easy means for quantitative, interval tracking of burn healing. Considering these limitations, the application of artificial intelligence has garnered significant interest. We herein present the use of three novel machine learning and computer vision-based algorithms to track burn wound healing. METHODS: Convolutional neural network (CNN) models were trained on 1800 2D colour burn images to classify them into four burn severities. These CNNs were used to develop saliency algorithms that identify the highest “attention” pixels used to recognize burns. Image-based algorithms that count these CNN attention pixels, count pixels representing red granulation in burns, and measure burn dimensions were also developed. As a proof of concept, we tracked the healing of a localized burn on a 25-year-old female patient. The patient suffered a scald on the dorsum of the foot, resulting in a deep partial-thickness burn. Opting out of surgical intervention, the patient visited the hospital over a 6-week period for treatment with non-adhesive dressings and silver nitrate. High-resolution images of the burn, with and without a fiducial marker, were captured with a smartphone camera every 7 days. Images were taken under institutional lighting and used as algorithmic inputs. RESULTS: Data analyses indicate that the healing of the open-wound area was accurately measured in millimetres (±1.7 mm error) using a fiducial marker (18.3 mm diameter). The open-wound area shrank consistently from week 1 to week 6 (Figure 1a-b). The normalized 2D colour images, in which “red” pixel values were counted (Figure 1a-b), confirm the reduction of red granulation in the wound. The saliency algorithm also measured a percentage reduction in the machine learning model’s total attention pixels over the 6-week period (Figure 1c-d). This suggests that the model became less discerning of the burn wound as it healed, consistent with burn healing, which was also clinically validated. Oxford University Press 2022-03-23 /pmc/articles/PMC8945745/ http://dx.doi.org/10.1093/jbcr/irac012.099 Text en © The Author(s) 2022. Published by Oxford University Press on behalf of the American Burn Association. https://creativecommons.org/licenses/by/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Correlative XII: Clinical Sciences Wounds & Scars 2 Chan, Hannah O Joshi, Rakesh Morzycki, Alexander Pun, Andrew C Wong, Joshua N Hong, Collins 96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing |
title | 96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing |
title_full | 96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing |
title_fullStr | 96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing |
title_full_unstemmed | 96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing |
title_short | 96.5 Using Computer Vision-Based Algorithms Trained on Mobile-Device Camera Images for Monitoring Burn Wound Healing |
title_sort | 96.5 using computer vision-based algorithms trained on mobile-device camera images for monitoring burn wound healing |
topic | Correlative XII: Clinical Sciences Wounds & Scars 2 |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8945745/ http://dx.doi.org/10.1093/jbcr/irac012.099 |
work_keys_str_mv | AT chanhannaho 965usingcomputervisionbasedalgorithmstrainedonmobiledevicecameraimagesformonitoringburnwoundhealing AT joshirakesh 965usingcomputervisionbasedalgorithmstrainedonmobiledevicecameraimagesformonitoringburnwoundhealing AT morzyckialexander 965usingcomputervisionbasedalgorithmstrainedonmobiledevicecameraimagesformonitoringburnwoundhealing AT punandrewc 965usingcomputervisionbasedalgorithmstrainedonmobiledevicecameraimagesformonitoringburnwoundhealing AT wongjoshuan 965usingcomputervisionbasedalgorithmstrainedonmobiledevicecameraimagesformonitoringburnwoundhealing AT hongcollins 965usingcomputervisionbasedalgorithmstrainedonmobiledevicecameraimagesformonitoringburnwoundhealing |