Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network
BACKGROUND: Gastroenteropancreatic neuroendocrine tumors most commonly metastasize to the liver; however, high normal background (68)Ga-DOTATATE activity and high image noise make metastatic lesions difficult to detect. The purpose of this study is to develop a rapid, automated and highly specific method to identify (68)Ga-DOTATATE PET/CT hepatic lesions using a 2D U-Net convolutional neural network.
Main Authors: | Wehrend, Jonathan; Silosky, Michael; Xing, Fuyong; Chin, Bennett B. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Springer Berlin Heidelberg, 2021 |
Subjects: | Original Research |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8487415/ https://www.ncbi.nlm.nih.gov/pubmed/34601660 http://dx.doi.org/10.1186/s13550-021-00839-x |
_version_ | 1784577951490113536 |
---|---|
author | Wehrend, Jonathan; Silosky, Michael; Xing, Fuyong; Chin, Bennett B.
author_facet | Wehrend, Jonathan; Silosky, Michael; Xing, Fuyong; Chin, Bennett B.
author_sort | Wehrend, Jonathan |
collection | PubMed |
description | BACKGROUND: Gastroenteropancreatic neuroendocrine tumors most commonly metastasize to the liver; however, high normal background (68)Ga-DOTATATE activity and high image noise make metastatic lesions difficult to detect. The purpose of this study is to develop a rapid, automated and highly specific method to identify (68)Ga-DOTATATE PET/CT hepatic lesions using a 2D U-Net convolutional neural network. METHODS: A retrospective study of (68)Ga-DOTATATE PET/CT patient studies (n = 125; 57 with (68)Ga-DOTATATE hepatic lesions and 68 without) was evaluated. The dataset was randomly divided into 75 studies for the training set (36 abnormal, 39 normal), 25 for the validation set (11 abnormal, 14 normal) and 25 for the testing set (11 abnormal, 14 normal). Hepatic lesions were physician annotated using a modified PERCIST threshold, and boundary definition by gradient edge detection. The 2D U-Net was trained independently five times for 100,000 iterations using a linear combination of binary cross-entropy and dice losses with a stochastic gradient descent algorithm. Performance metrics included: positive predictive value (PPV), sensitivity, F(1) score and area under the precision–recall curve (PR-AUC). Five different pixel area thresholds were used to filter noisy predictions. RESULTS: A total of 233 lesions were annotated with each abnormal study containing a mean of 4 ± 2.75 lesions. A pixel filter of 20 produced the highest mean PPV 0.94 ± 0.01. A pixel filter of 5 produced the highest mean sensitivity 0.74 ± 0.02. The highest mean F(1) score 0.79 ± 0.01 was produced with a 20 pixel filter. The highest mean PR-AUC 0.73 ± 0.03 was produced with a 15 pixel filter. CONCLUSION: Deep neural networks can automatically detect hepatic lesions in (68)Ga-DOTATATE PET. Ongoing improvements in data annotation methods, increasing sample sizes and training methods are anticipated to further improve detection performance. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s13550-021-00839-x. |
format | Online Article Text |
id | pubmed-8487415 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Springer Berlin Heidelberg |
record_format | MEDLINE/PubMed |
spelling | pubmed-8487415 2021-10-04 Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network Wehrend, Jonathan; Silosky, Michael; Xing, Fuyong; Chin, Bennett B. EJNMMI Res Original Research Springer Berlin Heidelberg 2021-10-02 /pmc/articles/PMC8487415/ /pubmed/34601660 http://dx.doi.org/10.1186/s13550-021-00839-x Text en © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
spellingShingle | Original Research Wehrend, Jonathan; Silosky, Michael; Xing, Fuyong; Chin, Bennett B. Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network
title | Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network |
title_full | Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network |
title_fullStr | Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network |
title_full_unstemmed | Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network |
title_short | Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network |
title_sort | automated liver lesion detection in (68)ga dotatate pet/ct using a deep fully convolutional neural network |
topic | Original Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8487415/ https://www.ncbi.nlm.nih.gov/pubmed/34601660 http://dx.doi.org/10.1186/s13550-021-00839-x |
work_keys_str_mv | AT wehrendjonathan automatedliverlesiondetectionin68gadotatatepetctusingadeepfullyconvolutionalneuralnetwork AT siloskymichael automatedliverlesiondetectionin68gadotatatepetctusingadeepfullyconvolutionalneuralnetwork AT xingfuyong automatedliverlesiondetectionin68gadotatatepetctusingadeepfullyconvolutionalneuralnetwork AT chinbennettb automatedliverlesiondetectionin68gadotatatepetctusingadeepfullyconvolutionalneuralnetwork |
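The abstract in the description field above mentions two implementation details that a short sketch may help make concrete: the 2D U-Net was trained with a linear combination of binary cross-entropy and Dice losses, and small noisy predictions were removed with pixel-area filters (5, 15 and 20 pixels are cited in the abstract). The snippet below is a minimal illustration only, not the authors' code: the function names, the loss weights `bce_weight`/`dice_weight`, the smoothing constant `eps` and the default `min_area` are assumptions, since the record does not specify them.

```python
import numpy as np
from scipy import ndimage


def combined_bce_dice_loss(pred, target, bce_weight=0.5, dice_weight=0.5, eps=1e-7):
    """Linear combination of binary cross-entropy and soft Dice losses.

    pred   : predicted lesion probabilities in (0, 1)
    target : binary ground-truth lesion mask, same shape as pred
    The 0.5/0.5 weights and eps are illustrative; the abstract does not report them.
    """
    pred = np.clip(pred, eps, 1.0 - eps)
    bce = -np.mean(target * np.log(pred) + (1.0 - target) * np.log(1.0 - pred))

    intersection = np.sum(pred * target)
    dice = (2.0 * intersection + eps) / (np.sum(pred) + np.sum(target) + eps)

    return bce_weight * bce + dice_weight * (1.0 - dice)


def filter_small_predictions(binary_mask, min_area=20):
    """Drop predicted lesion components smaller than min_area pixels.

    Mirrors the idea of the pixel-area thresholds used to suppress noisy
    predictions; a 20-pixel filter gave the highest PPV and F1 in the abstract.
    """
    labeled, n_components = ndimage.label(binary_mask)
    kept = np.zeros_like(binary_mask, dtype=bool)
    for component_id in range(1, n_components + 1):
        component = labeled == component_id
        if component.sum() >= min_area:
            kept |= component
    return kept


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pred = rng.random((64, 64))                            # stand-in for U-Net output probabilities
    target = (rng.random((64, 64)) > 0.95).astype(float)   # stand-in for an annotated lesion mask
    print("combined loss:", combined_bce_dice_loss(pred, target))
    print("pixels kept:", int(filter_small_predictions(pred > 0.5, min_area=20).sum()))
```

In practice the loss would be computed on GPU tensors inside the training loop (the study reports five independent runs of 100,000 SGD iterations), and the area filter would be applied to the thresholded network output before computing PPV, sensitivity, F1 and PR-AUC.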