Breast elastography: diagnostic performance of computer-aided diagnosis software and interobserver agreement


Bibliographic Details
Main Authors: Fleury, Eduardo F. C., Marcomini, Karem
Format: Online Article Text
Language: English
Published: Colégio Brasileiro de Radiologia e Diagnóstico por Imagem, 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7159052/
https://www.ncbi.nlm.nih.gov/pubmed/32313333
http://dx.doi.org/10.1590/0100-3984.2019.0035
author Fleury, Eduardo F. C.
Marcomini, Karem
collection PubMed
description OBJECTIVE: To determine the best cutoff value for classifying breast masses by ultrasound elastography, using dedicated software for strain elastography, and to determine the level of interobserver agreement. MATERIALS AND METHODS: We enrolled 83 patients with 83 breast masses identified on ultrasound and referred for biopsy. After B-mode ultrasound examination, the lesions were manually segmented by three radiologists with varying degrees of experience in breast imaging, designated reader 1 (R1, with 15 years of experience), reader 2 (R2, with 2 years), and reader 3 (R3, with 8 years). Elastography was performed automatically on the best image with computer-aided diagnosis (CAD) software. Cutoff values of 70%, 75%, 80%, and 90% of hard areas were applied to determine the performance of the CAD software. The best cutoff value for the most experienced radiologists was then compared with the visual assessment. Interobserver agreement at the best cutoff value was determined, as were the intraclass correlation coefficient and the concordance among the radiologists for the segmented areas. RESULTS: For the experienced radiologists, the best cutoff value for the proportion of hard area within a breast mass was found to be 75%. At that cutoff, interobserver agreement was excellent between R1 and R2, as well as between R1 and R3, and good between R2 and R3. The intraclass correlation coefficient among the three radiologists was 0.950. When assessing the segmented areas by size, we found that the level of agreement was higher among the more experienced radiologists. CONCLUSION: The best cutoff value for a quantitative CAD system to classify breast masses was 75%.
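The classification rule the abstract describes — flag a mass when the proportion of hard (stiff) area inside the segmented lesion meets a cutoff, with 75% found optimal — can be sketched as follows. This is a minimal illustration of the cutoff logic only; the function names, the flat 0/1 mask representation, and the output labels are assumptions for this sketch, not details of the actual CAD software used in the study.

```python
def hard_area_fraction(mask, hard):
    """Fraction of lesion pixels that the elastogram marks as hard.

    mask, hard: equal-length flat lists of 0/1 values (assumed layout).
    mask[i] == 1 marks pixels inside the segmented lesion;
    hard[i] == 1 marks pixels colored as hard (stiff) tissue.
    """
    lesion_pixels = sum(mask)
    if lesion_pixels == 0:
        raise ValueError("empty segmentation")
    hard_in_lesion = sum(m and h for m, h in zip(mask, hard))
    return hard_in_lesion / lesion_pixels

def classify(mask, hard, cutoff=0.75):
    """Flag the mass when >= cutoff of its area is hard.

    cutoff=0.75 corresponds to the 75% value the study found optimal;
    the labels returned here are illustrative, not the study's categories.
    """
    frac = hard_area_fraction(mask, hard)
    return "suspicious" if frac >= cutoff else "benign-appearing"

# Toy example: 4 lesion pixels, 3 of them hard -> fraction 0.75 -> flagged.
mask = [1, 1, 1, 1, 0, 0]
hard = [1, 1, 1, 0, 1, 0]
print(classify(mask, hard))  # suspicious
```

Note that the hard pixel outside the lesion (`mask == 0`) is ignored: only overlap between the segmentation and the hard-tissue map counts toward the fraction, which is why the per-reader segmentations matter for the result.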
format Online
Article
Text
id pubmed-7159052
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Colégio Brasileiro de Radiologia e Diagnóstico por Imagem
record_format MEDLINE/PubMed
spelling pubmed-7159052 2020-04-20 Breast elastography: diagnostic performance of computer-aided diagnosis software and interobserver agreement. Fleury, Eduardo F. C.; Marcomini, Karem. Radiol Bras, Original Articles.
Colégio Brasileiro de Radiologia e Diagnóstico por Imagem 2020 /pmc/articles/PMC7159052/ /pubmed/32313333 http://dx.doi.org/10.1590/0100-3984.2019.0035 Text en http://creativecommons.org/licenses/by/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
title Breast elastography: diagnostic performance of computer-aided diagnosis software and interobserver agreement
topic Original Articles