Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response
The accurate, objective, and reproducible evaluation of tumor response to therapy is indispensable in clinical trials. This study aimed at investigating the reliability and reproducibility of a computer-aided contouring (CAC) tool in tumor measurements and its impact on evaluation of tumor response...
Main Authors: | Li, Hongsen; Shen, Jiaying; Shou, Jiawei; Han, Weidong; Gong, Liu; Xu, Yiming; Chen, Peng; Wang, Kaixin; Zhang, Shuangfeng; Sun, Chao; Zhang, Jie; Niu, Zhongfeng; Pan, Hongming; Cai, Wenli; Fang, Yong |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2022 |
Subjects: | Oncology |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8841678/ https://www.ncbi.nlm.nih.gov/pubmed/35174064 http://dx.doi.org/10.3389/fonc.2021.691638 |
_version_ | 1784650888025997312 |
author | Li, Hongsen Shen, Jiaying Shou, Jiawei Han, Weidong Gong, Liu Xu, Yiming Chen, Peng Wang, Kaixin Zhang, Shuangfeng Sun, Chao Zhang, Jie Niu, Zhongfeng Pan, Hongming Cai, Wenli Fang, Yong |
author_facet | Li, Hongsen Shen, Jiaying Shou, Jiawei Han, Weidong Gong, Liu Xu, Yiming Chen, Peng Wang, Kaixin Zhang, Shuangfeng Sun, Chao Zhang, Jie Niu, Zhongfeng Pan, Hongming Cai, Wenli Fang, Yong |
author_sort | Li, Hongsen |
collection | PubMed |
description | The accurate, objective, and reproducible evaluation of tumor response to therapy is indispensable in clinical trials. This study aimed at investigating the reliability and reproducibility of a computer-aided contouring (CAC) tool in tumor measurements and its impact on evaluation of tumor response in terms of RECIST 1.1 criteria. A total of 200 cancer patients were retrospectively included in this study and randomly divided into two sets of 100 patients for experiential learning and testing. A total of 744 target lesions were identified by a senior radiologist in distinctive body parts, of which 278 lesions were in data set 1 (learning set) and 466 lesions were in data set 2 (testing set). Five image analysts were each instructed to measure lesion diameter using manual and CAC tools in data set 1 and subsequently tested in data set 2. The interobserver variability of tumor measurements was validated by using the coefficient of variance (CV), the Pearson correlation coefficient (PCC), and the interobserver correlation coefficient (ICC). We verified that the mean CV of manual measurement remained constant between the learning and testing data sets (0.33 vs. 0.32, p = 0.490), whereas it decreased for the CAC measurements after learning (0.24 vs. 0.19, p < 0.001). The interobserver measurements with good agreement (CV < 0.20) were 29.9% (manual) vs. 49.0% (CAC) in the learning set (p < 0.001) and 30.9% (manual) vs. 64.4% (CAC) in the testing set (p < 0.001). The mean PCCs were 0.56 ± 0.11 mm (manual) vs. 0.69 ± 0.10 mm (CAC) in the learning set (p = 0.013) and 0.73 ± 0.07 mm (manual) vs. 0.84 ± 0.03 mm (CAC) in the testing set (p < 0.001). ICCs were 0.633 (manual) vs. 0.698 (CAC) in the learning set (p < 0.001) and 0.716 (manual) vs. 0.824 (CAC) in the testing set (p < 0.001). The Fleiss’ kappa analysis revealed that the overall agreement was 58.7% (manual) vs. 58.9% (CAC) in the learning set and 62.9% (manual) vs. 74.5% (CAC) in the testing set. The 80% agreement of tumor response evaluation was 55.0% (manual) vs. 66.0% (CAC) in the learning set and 60.6% (manual) vs. 79.7% (CAC) in the testing set. In conclusion, CAC can reduce the interobserver variability of radiological tumor measurements and thus improve the agreement of imaging evaluation of tumor response. |
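The abstract names several interobserver agreement statistics (the coefficient of variance, the pairwise Pearson correlation coefficient, and Fleiss’ kappa). The following is a minimal sketch, using hypothetical lesion-diameter and response-category values rather than the study’s data, of how such statistics could be computed in Python with numpy, scipy, and statsmodels; it is not the authors’ implementation.

```python
# Minimal sketch (not the authors' implementation): interobserver agreement
# statistics of the kind named in the abstract, on hypothetical data.
import numpy as np
from scipy.stats import pearsonr
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical lesion diameters (mm): rows = lesions, columns = 5 analysts.
diameters = np.array([
    [21.0, 23.5, 20.1, 22.8, 21.9],
    [34.2, 31.0, 35.6, 33.1, 32.4],
    [12.5, 14.0, 11.8, 13.2, 12.9],
    [47.3, 45.9, 50.2, 48.8, 46.5],
])

# Coefficient of variance (CV) per lesion: interobserver SD / mean diameter.
cv = diameters.std(axis=1, ddof=1) / diameters.mean(axis=1)
print("per-lesion CV:", np.round(cv, 3))
print("fraction with good agreement (CV < 0.20):", (cv < 0.20).mean())

# Mean pairwise Pearson correlation coefficient (PCC) between analysts.
n = diameters.shape[1]
pccs = [pearsonr(diameters[:, i], diameters[:, j])[0]
        for i in range(n) for j in range(i + 1, n)]
print("mean pairwise PCC:", round(float(np.mean(pccs)), 3))

# Fleiss' kappa on hypothetical response categories assigned by the same
# 5 analysts (0 = partial response, 1 = stable disease, 2 = progression).
responses = np.array([
    [1, 1, 1, 0, 1],
    [2, 2, 2, 2, 2],
    [0, 1, 0, 0, 1],
    [1, 1, 2, 1, 1],
])
table, _ = aggregate_raters(responses)
print("Fleiss' kappa:", round(fleiss_kappa(table), 3))
```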
format | Online Article Text |
id | pubmed-8841678 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-8841678 2022-02-15 Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response Li, Hongsen Shen, Jiaying Shou, Jiawei Han, Weidong Gong, Liu Xu, Yiming Chen, Peng Wang, Kaixin Zhang, Shuangfeng Sun, Chao Zhang, Jie Niu, Zhongfeng Pan, Hongming Cai, Wenli Fang, Yong Front Oncol Oncology The accurate, objective, and reproducible evaluation of tumor response to therapy is indispensable in clinical trials. This study aimed at investigating the reliability and reproducibility of a computer-aided contouring (CAC) tool in tumor measurements and its impact on evaluation of tumor response in terms of RECIST 1.1 criteria. A total of 200 cancer patients were retrospectively included in this study and randomly divided into two sets of 100 patients for experiential learning and testing. A total of 744 target lesions were identified by a senior radiologist in distinctive body parts, of which 278 lesions were in data set 1 (learning set) and 466 lesions were in data set 2 (testing set). Five image analysts were each instructed to measure lesion diameter using manual and CAC tools in data set 1 and subsequently tested in data set 2. The interobserver variability of tumor measurements was validated by using the coefficient of variance (CV), the Pearson correlation coefficient (PCC), and the interobserver correlation coefficient (ICC). We verified that the mean CV of manual measurement remained constant between the learning and testing data sets (0.33 vs. 0.32, p = 0.490), whereas it decreased for the CAC measurements after learning (0.24 vs. 0.19, p < 0.001). The interobserver measurements with good agreement (CV < 0.20) were 29.9% (manual) vs. 49.0% (CAC) in the learning set (p < 0.001) and 30.9% (manual) vs. 64.4% (CAC) in the testing set (p < 0.001). The mean PCCs were 0.56 ± 0.11 mm (manual) vs. 0.69 ± 0.10 mm (CAC) in the learning set (p = 0.013) and 0.73 ± 0.07 mm (manual) vs. 0.84 ± 0.03 mm (CAC) in the testing set (p < 0.001). ICCs were 0.633 (manual) vs. 0.698 (CAC) in the learning set (p < 0.001) and 0.716 (manual) vs. 0.824 (CAC) in the testing set (p < 0.001). The Fleiss’ kappa analysis revealed that the overall agreement was 58.7% (manual) vs. 58.9% (CAC) in the learning set and 62.9% (manual) vs. 74.5% (CAC) in the testing set. The 80% agreement of tumor response evaluation was 55.0% (manual) vs. 66.0% (CAC) in the learning set and 60.6% (manual) vs. 79.7% (CAC) in the testing set. In conclusion, CAC can reduce the interobserver variability of radiological tumor measurements and thus improve the agreement of imaging evaluation of tumor response. Frontiers Media S.A. 2022-01-31 /pmc/articles/PMC8841678/ /pubmed/35174064 http://dx.doi.org/10.3389/fonc.2021.691638 Text en Copyright © 2022 Li, Shen, Shou, Han, Gong, Xu, Chen, Wang, Zhang, Sun, Zhang, Niu, Pan, Cai and Fang https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Oncology Li, Hongsen Shen, Jiaying Shou, Jiawei Han, Weidong Gong, Liu Xu, Yiming Chen, Peng Wang, Kaixin Zhang, Shuangfeng Sun, Chao Zhang, Jie Niu, Zhongfeng Pan, Hongming Cai, Wenli Fang, Yong Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response |
title | Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response |
title_full | Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response |
title_fullStr | Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response |
title_full_unstemmed | Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response |
title_short | Exploring the Interobserver Agreement in Computer-Aided Radiologic Tumor Measurement and Evaluation of Tumor Response |
title_sort | exploring the interobserver agreement in computer-aided radiologic tumor measurement and evaluation of tumor response |
topic | Oncology |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8841678/ https://www.ncbi.nlm.nih.gov/pubmed/35174064 http://dx.doi.org/10.3389/fonc.2021.691638 |
work_keys_str_mv | AT lihongsen exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT shenjiaying exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT shoujiawei exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT hanweidong exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT gongliu exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT xuyiming exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT chenpeng exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT wangkaixin exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT zhangshuangfeng exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT sunchao exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT zhangjie exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT niuzhongfeng exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT panhongming exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT caiwenli exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse AT fangyong exploringtheinterobserveragreementincomputeraidedradiologictumormeasurementandevaluationoftumorresponse |