Seeing under the cover with a 3D U-Net: point cloud-based weight estimation of covered patients

Bibliographic Details
Main Authors: Bigalke, Alexander; Hansen, Lasse; Diesel, Jasper; Heinrich, Mattias P.
Format: Online Article Text
Language: English
Published: Springer International Publishing, 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8616862/
https://www.ncbi.nlm.nih.gov/pubmed/34420184
http://dx.doi.org/10.1007/s11548-021-02476-0
Abstract

PURPOSE: Body weight is a crucial parameter for patient-specific treatment, particularly for proper drug dosing. Contactless weight estimation from visual sensor data is a promising way to overcome the challenges that arise in emergency situations. Machine learning-based methods have recently been shown to estimate weight accurately from point cloud data. These methods, however, are designed for controlled conditions regarding the visibility and position of the patient, which limits their practical applicability. In this work, we aim to decouple accurate weight estimation from such specific conditions by predicting the weight of covered patients from voxelized point cloud data.

METHODS: We propose a novel deep learning framework comprising two 3D CNN modules that solve the task in two separate steps. First, we train a 3D U-Net to virtually uncover the patient, i.e., to predict the patient's volumetric surface without a cover. Second, the patient's weight is predicted from this 3D volume by a 3D CNN architecture that we optimized for weight regression.

RESULTS: We evaluate our approach on a lying-pose dataset (SLP) under two different cover conditions. The proposed framework considerably improves on the baseline model by up to [Formula: see text] and reduces the gap between the accuracy of weight estimates for covered and uncovered patients by up to [Formula: see text].

CONCLUSION: We present a novel pipeline to estimate the weight of patients who are covered by a blanket. Our approach relaxes the specific conditions that previous contactless methods required for accurate weight estimates and thus constitutes an important step towards fully automatic weight estimation in clinical practice.
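The framework's input is a voxelized point cloud. As a minimal sketch of that preprocessing step (the abstract does not state the authors' grid resolution or normalization, so the `grid_shape` and bounds handling below are illustrative assumptions, not the paper's actual pipeline), a point cloud can be binned into a binary occupancy grid like this:

```python
import numpy as np

def voxelize(points, grid_shape=(64, 64, 32), bounds=None):
    """Convert an (N, 3) point cloud to a binary occupancy grid.

    points:     iterable of (x, y, z) coordinates.
    grid_shape: number of voxels along each axis (illustrative default).
    bounds:     optional (lo, hi) arrays; defaults to the cloud's bounding box.
    """
    points = np.asarray(points, dtype=np.float64)
    if bounds is None:
        lo, hi = points.min(axis=0), points.max(axis=0)
    else:
        lo, hi = np.asarray(bounds[0]), np.asarray(bounds[1])
    span = np.maximum(hi - lo, 1e-9)  # avoid division by zero on flat axes
    # Map each coordinate into [0, grid_shape) and clip to the grid edges.
    idx = ((points - lo) / span * np.array(grid_shape)).astype(int)
    idx = np.clip(idx, 0, np.array(grid_shape) - 1)
    grid = np.zeros(grid_shape, dtype=np.uint8)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = 1  # mark occupied voxels
    return grid
```

The resulting dense 3D grid is the kind of input a 3D U-Net (stage one, virtual uncovering) and a 3D CNN regressor (stage two, weight prediction) can consume directly with standard volumetric convolutions.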
Journal: Int J Comput Assist Radiol Surg (Original Article)
Published online: 2021-08-21
License: © The Author(s) 2021. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/