Comparative Assessment of Digital Pathology Systems for Primary Diagnosis
Main authors: Rajaganesan, Sathyanarayanan; Kumar, Rajiv; Rao, Vidya; Pai, Trupti; Mittal, Neha; Sahay, Ayushi; Menon, Santosh; Desai, Sangeeta
Format: Online Article Text
Language: English
Published: Wolters Kluwer - Medknow, 2021
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8356707/ https://www.ncbi.nlm.nih.gov/pubmed/34447605 http://dx.doi.org/10.4103/jpi.jpi_94_20
_version_ | 1783736997454544896 |
author | Rajaganesan, Sathyanarayanan Kumar, Rajiv Rao, Vidya Pai, Trupti Mittal, Neha Sahay, Ayushi Menon, Santosh Desai, Sangeeta |
author_facet | Rajaganesan, Sathyanarayanan Kumar, Rajiv Rao, Vidya Pai, Trupti Mittal, Neha Sahay, Ayushi Menon, Santosh Desai, Sangeeta |
author_sort | Rajaganesan, Sathyanarayanan |
collection | PubMed |
description | BACKGROUND: Despite increasing interest in whole-slide imaging (WSI) over optical microscopy (OM), limited information is available on the comparative assessment of various digital pathology systems (DPSs). MATERIALS AND METHODS: A comprehensive evaluation was undertaken to investigate the technical performance assessment and diagnostic accuracy of four DPSs, with the objective of establishing the noninferiority of WSI to OM and identifying the best possible DPS for clinical workflow. RESULTS: A total of 2376 digital images, 15,775 image reads (OM - 3171 + WSI - 12,404), and 6100 diagnostic reads (OM - 1245, WSI - 4855) were generated across four DPSs (coded as DPS 1, 2, 3, and 4) using a total of 240 cases (604 slides). Onsite technical evaluation revealed successful scan rate: DPS3 < DPS2 < DPS4 < DPS1; mean scanning time: DPS4 < DPS1 < DPS2 < DPS3; and average storage space: DPS3 < DPS2 < DPS1 < DPS4. Overall diagnostic accuracy, when compared with the reference standard, was 95.44% for OM (including 2.48% minor and 2.08% major discordances) and 93.32% for WSI (including 4.28% minor and 2.4% major discordances). The difference in clinically significant discordances between WSI and OM was 0.32%. Major discordances were observed most often with DPS4 and least often with DPS1; however, the difference was statistically insignificant. Almost perfect (κ ≥ 0.8)/substantial (κ = 0.6–0.8) inter-/intra-observer agreement between WSI and OM was observed for all specimen types except cytology. Overall image quality was best for DPS1, followed by DPS4. The mean digital artifact rate was 6.8% (163/2376 digital images), and the most artifacts were noted in DPS2 (n = 77), followed by DPS3 (n = 36). Most pathologists preferred the viewing software of DPS1 and DPS2. CONCLUSION: WSI was noninferior to OM for all specimen types except cytology. Each DPS has its own pros and cons; however, DPS1 most closely emulated the real-world clinical environment. This evaluation is intended to provide a roadmap for pathologists in selecting appropriate DPSs when adopting WSI. |
format | Online Article Text |
id | pubmed-8356707 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Wolters Kluwer - Medknow |
record_format | MEDLINE/PubMed |
spelling | pubmed-83567072021-08-25 Comparative Assessment of Digital Pathology Systems for Primary Diagnosis Rajaganesan, Sathyanarayanan Kumar, Rajiv Rao, Vidya Pai, Trupti Mittal, Neha Sahay, Ayushi Menon, Santosh Desai, Sangeeta J Pathol Inform Original Article Wolters Kluwer - Medknow 2021-06-09 /pmc/articles/PMC8356707/ /pubmed/34447605 http://dx.doi.org/10.4103/jpi.jpi_94_20 Text en Copyright: © 2021 Journal of Pathology Informatics https://creativecommons.org/licenses/by-nc-sa/4.0/ This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms. |
spellingShingle | Original Article Rajaganesan, Sathyanarayanan Kumar, Rajiv Rao, Vidya Pai, Trupti Mittal, Neha Sahay, Ayushi Menon, Santosh Desai, Sangeeta Comparative Assessment of Digital Pathology Systems for Primary Diagnosis |
title | Comparative Assessment of Digital Pathology Systems for Primary Diagnosis |
title_full | Comparative Assessment of Digital Pathology Systems for Primary Diagnosis |
title_fullStr | Comparative Assessment of Digital Pathology Systems for Primary Diagnosis |
title_full_unstemmed | Comparative Assessment of Digital Pathology Systems for Primary Diagnosis |
title_short | Comparative Assessment of Digital Pathology Systems for Primary Diagnosis |
title_sort | comparative assessment of digital pathology systems for primary diagnosis |
topic | Original Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8356707/ https://www.ncbi.nlm.nih.gov/pubmed/34447605 http://dx.doi.org/10.4103/jpi.jpi_94_20 |
work_keys_str_mv | AT rajaganesansathyanarayanan comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis AT kumarrajiv comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis AT raovidya comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis AT paitrupti comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis AT mittalneha comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis AT sahayayushi comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis AT menonsantosh comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis AT desaisangeeta comparativeassessmentofdigitalpathologysystemsforprimarydiagnosis |