
Multimodal ultrasound fusion network for differentiating between benign and malignant solid renal tumors


Bibliographic Details
Main Authors: Zhu, Dongmei, Li, Junyu, Li, Yan, Wu, Ji, Zhu, Lin, Li, Jian, Wang, Zimo, Xu, Jinfeng, Dong, Fajin, Cheng, Jun
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9488515/
https://www.ncbi.nlm.nih.gov/pubmed/36148014
http://dx.doi.org/10.3389/fmolb.2022.982703
Description
Summary: Objective: We aim to establish a deep learning model, the multimodal ultrasound fusion network (MUF-Net), based on gray-scale and contrast-enhanced ultrasound (CEUS) images to automatically classify benign and malignant solid renal tumors, and to compare the model's performance with assessments by radiologists with different levels of experience.

Methods: A retrospective study included CEUS videos of 181 patients with solid renal tumors (81 benign and 100 malignant) from June 2012 to June 2021. A total of 9,794 B-mode and CEUS-mode images were cropped from the videos. MUF-Net combines gray-scale and CEUS images to differentiate benign from malignant solid renal tumors: two independent branches extract features from the two modalities, the features are fused using adaptive weights, and the network outputs a classification score based on the fused features. Performance was evaluated with five-fold cross-validation and compared with the assessments of two groups of radiologists with different levels of experience.

Results: For discriminating benign from malignant solid renal tumors, the junior radiologist group, senior radiologist group, and MUF-Net achieved accuracies of 70.6%, 75.7%, and 80.0%; sensitivities of 89.3%, 95.9%, and 80.4%; specificities of 58.7%, 62.9%, and 79.1%; and areas under the receiver operating characteristic curve of 0.740 (95% confidence interval (CI): 0.70–0.75), 0.794 (95% CI: 0.72–0.83), and 0.877 (95% CI: 0.83–0.93), respectively.

Conclusion: The MUF-Net model can accurately classify benign and malignant solid renal tumors and achieves better performance than senior radiologists.

Key points: The CEUS video data contain the entire tumor microcirculation perfusion characteristics.
The proposed MUF-Net based on B-mode and CEUS-mode images can accurately distinguish between benign and malignant solid renal tumors with an area under the receiver operating characteristic curve of 0.877, which surpasses senior radiologists’ assessments by a large margin.
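The two-branch design with adaptive weighted fusion described above can be illustrated with a minimal PyTorch sketch. This is not the authors' published MUF-Net architecture; the branch depth, feature dimension, and the gating layer used to compute the adaptive weights are all illustrative assumptions, chosen only to show how per-sample modality weights can fuse B-mode and CEUS features before classification.

```python
import torch
import torch.nn as nn

class DualBranchFusion(nn.Module):
    """Illustrative two-branch classifier with adaptive weighted fusion.

    NOT the authors' exact MUF-Net; layer sizes and the gating scheme
    are assumptions for demonstration.
    """
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        # One small convolutional encoder per modality (B-mode and CEUS-mode).
        def make_branch() -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(32, feat_dim),
            )
        self.bmode_branch = make_branch()
        self.ceus_branch = make_branch()
        # Adaptive fusion: predict one weight per modality from both features,
        # normalized with softmax so the weights sum to 1 per sample.
        self.gate = nn.Linear(2 * feat_dim, 2)
        self.classifier = nn.Linear(feat_dim, 2)  # benign vs. malignant

    def forward(self, bmode: torch.Tensor, ceus: torch.Tensor) -> torch.Tensor:
        fb = self.bmode_branch(bmode)   # features from gray-scale images
        fc = self.ceus_branch(ceus)     # features from CEUS images
        w = torch.softmax(self.gate(torch.cat([fb, fc], dim=1)), dim=1)
        fused = w[:, :1] * fb + w[:, 1:] * fc  # adaptive weighted fusion
        return self.classifier(fused)           # classification score

model = DualBranchFusion()
# A batch of 4 single-channel 64x64 image pairs (random stand-ins).
logits = model(torch.randn(4, 1, 64, 64), torch.randn(4, 1, 64, 64))
print(logits.shape)
```

Because the fusion weights are computed from the concatenated features of both branches, the network can emphasize whichever modality is more informative for a given tumor rather than using a fixed mixing ratio.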