
Measuring digital pathology throughput and tissue dropouts

Bibliographic Details
Main Authors: Mutter, George L., Milstone, David S., Hwang, David H., Siegmund, Stephanie, Bruce, Alexander
Format: Online Article Text
Language: English
Published: Elsevier 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8794031/
https://www.ncbi.nlm.nih.gov/pubmed/35136675
http://dx.doi.org/10.4103/jpi.jpi_5_21
_version_ 1784640737267154944
author Mutter, George L.
Milstone, David S.
Hwang, David H.
Siegmund, Stephanie
Bruce, Alexander
author_facet Mutter, George L.
Milstone, David S.
Hwang, David H.
Siegmund, Stephanie
Bruce, Alexander
author_sort Mutter, George L.
collection PubMed
description BACKGROUND: Digital pathology operations that precede viewing by a pathologist have a substantial impact on costs and fidelity of the digital image. Scan time and file size determine throughput and storage costs, whereas tissue omission during digital capture (“dropouts”) compromises downstream interpretation. We compared how these variables differ across scanners. METHODS: A 212-slide set randomly selected from a gynecologic-gestational pathology practice was used to benchmark scan time, file size, and image completeness. Workflows included the Hamamatsu S210 scanner (operated under default and optimized profiles) and the Leica GT450. Digital tissue dropouts were detected by the aligned overlay of macroscopic glass slide camera images (reference) with the whole slide images created by the slide scanners. RESULTS: File size and scan time were highly correlated within each platform. Differences in GT450, default S210, and optimized S210 performance were seen in average file size (1.4 vs. 2.5 vs. 3.4 GB) and scan time (93 vs. 376 vs. 721 s). Dropouts were seen in 29.5% (186/631) of successful scans overall, ranging from a low of 13.7% (29/212) for the optimized S210 profile to 34.6% (73/211) for the GT450 and 40.4% (84/208) for the default S210 profile. Small dislodged fragments, “shards,” were dropped in 22.2% (140/631) of slides, followed by tissue marginalized at the glass slide edges (6.2%, 39/631). “Unique dropouts,” those for which no equivalent appeared elsewhere in the scan, occurred in only three slides. Of these, 67% (2/3) were “floaters” (contaminants from other cases). CONCLUSIONS: Scanning speed and resultant file size vary greatly by scanner type, scanner operation settings, and clinical specimen mix (tissue type, tissue area). Digital image fidelity, as measured by tissue dropout frequency and dropout type, also varies according to the tissue type and scanner. Dropped tissues very rarely (1/631) represent actual specimen tissue that is not represented elsewhere in the scan, so in most cases they cannot alter the diagnosis. Digital pathology platforms vary in their output efficiency and image fidelity to the glass original and should be matched to the intended application.
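The dropout measurement described above (aligned overlay of a macroscopic glass-slide photograph with the scanner's whole slide image) can be illustrated with a minimal sketch. This is not the authors' code: the file names, the OpenCV ECC affine registration, the brightness threshold for tissue masking, and the comparison at thumbnail resolution are all illustrative assumptions.

    # Minimal sketch: flag tissue present in the macroscopic reference photo
    # but missing from the whole slide image (assumptions noted above).
    import cv2
    import numpy as np

    def tissue_mask(gray, blur=5, thresh=220):
        # Rough tissue mask: tissue is darker than the bright glass background.
        blurred = cv2.GaussianBlur(gray, (blur, blur), 0)
        return (blurred < thresh).astype(np.uint8)

    def align_to_reference(ref_gray, scan_gray, iterations=200):
        # Affine-align the scan thumbnail to the macro photo with ECC registration.
        warp = np.eye(2, 3, dtype=np.float32)
        criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, iterations, 1e-6)
        _, warp = cv2.findTransformECC(ref_gray, scan_gray, warp,
                                       cv2.MOTION_AFFINE, criteria, None, 5)
        h, w = ref_gray.shape
        return cv2.warpAffine(scan_gray, warp, (w, h),
                              flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)

    def dropout_fraction(ref_path, scan_path):
        # Fraction of reference tissue area with no counterpart in the scan.
        ref = cv2.imread(ref_path, cv2.IMREAD_GRAYSCALE)
        scan = cv2.imread(scan_path, cv2.IMREAD_GRAYSCALE)
        scan = cv2.resize(scan, (ref.shape[1], ref.shape[0]))
        aligned = align_to_reference(ref, scan)
        ref_mask, scan_mask = tissue_mask(ref), tissue_mask(aligned)
        dropped = np.logical_and(ref_mask == 1, scan_mask == 0)
        return dropped.sum() / max(int(ref_mask.sum()), 1)

    if __name__ == "__main__":
        # Hypothetical file names for a single slide.
        frac = dropout_fraction("macro_photo.png", "wsi_thumbnail.png")
        print(f"dropped tissue fraction: {frac:.3f}")

In the study, dropouts detected this way were further classified by type (shards, tissue at the slide edge, unique dropouts); the sketch only reports an aggregate dropped-area fraction per slide.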
format Online
Article
Text
id pubmed-8794031
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Elsevier
record_format MEDLINE/PubMed
spelling pubmed-87940312022-02-07 Measuring digital pathology throughput and tissue dropouts Mutter, George L. Milstone, David S. Hwang, David H. Siegmund, Stephanie Bruce, Alexander J Pathol Inform Original Research Article BACKGROUND: Digital pathology operations that precede viewing by a pathologist have a substantial impact on costs and fidelity of the digital image. Scan time and file size determine throughput and storage costs, whereas tissue omission during digital capture (“dropouts”) compromises downstream interpretation. We compared how these variables differ across scanners. METHODS: A 212-slide set randomly selected from a gynecologic-gestational pathology practice was used to benchmark scan time, file size, and image completeness. Workflows included the Hamamatsu S210 scanner (operated under default and optimized profiles) and the Leica GT450. Digital tissue dropouts were detected by the aligned overlay of macroscopic glass slide camera images (reference) with the whole slide images created by the slide scanners. RESULTS: File size and scan time were highly correlated within each platform. Differences in GT450, default S210, and optimized S210 performance were seen in average file size (1.4 vs. 2.5 vs. 3.4 GB) and scan time (93 vs. 376 vs. 721 s). Dropouts were seen in 29.5% (186/631) of successful scans overall, ranging from a low of 13.7% (29/212) for the optimized S210 profile to 34.6% (73/211) for the GT450 and 40.4% (84/208) for the default S210 profile. Small dislodged fragments, “shards,” were dropped in 22.2% (140/631) of slides, followed by tissue marginalized at the glass slide edges (6.2%, 39/631). “Unique dropouts,” those for which no equivalent appeared elsewhere in the scan, occurred in only three slides. Of these, 67% (2/3) were “floaters” (contaminants from other cases). CONCLUSIONS: Scanning speed and resultant file size vary greatly by scanner type, scanner operation settings, and clinical specimen mix (tissue type, tissue area). Digital image fidelity, as measured by tissue dropout frequency and dropout type, also varies according to the tissue type and scanner. Dropped tissues very rarely (1/631) represent actual specimen tissue that is not represented elsewhere in the scan, so in most cases they cannot alter the diagnosis. Digital pathology platforms vary in their output efficiency and image fidelity to the glass original and should be matched to the intended application. Elsevier 2022-12-20 /pmc/articles/PMC8794031/ /pubmed/35136675 http://dx.doi.org/10.4103/jpi.jpi_5_21 Text en © 2022 Published by Elsevier Inc. on behalf of Association for Pathology Informatics. https://creativecommons.org/licenses/by-nc-nd/4.0/ This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
spellingShingle Original Research Article
Mutter, George L.
Milstone, David S.
Hwang, David H.
Siegmund, Stephanie
Bruce, Alexander
Measuring digital pathology throughput and tissue dropouts
title Measuring digital pathology throughput and tissue dropouts
title_full Measuring digital pathology throughput and tissue dropouts
title_fullStr Measuring digital pathology throughput and tissue dropouts
title_full_unstemmed Measuring digital pathology throughput and tissue dropouts
title_short Measuring digital pathology throughput and tissue dropouts
title_sort measuring digital pathology throughput and tissue dropouts
topic Original Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8794031/
https://www.ncbi.nlm.nih.gov/pubmed/35136675
http://dx.doi.org/10.4103/jpi.jpi_5_21
work_keys_str_mv AT muttergeorgel measuringdigitalpathologythroughputandtissuedropouts
AT milstonedavids measuringdigitalpathologythroughputandtissuedropouts
AT hwangdavidh measuringdigitalpathologythroughputandtissuedropouts
AT siegmundstephanie measuringdigitalpathologythroughputandtissuedropouts
AT brucealexander measuringdigitalpathologythroughputandtissuedropouts