Using parallel pre-trained types of DCNN model to predict breast cancer with color normalization

Bibliographic Details
Main Authors: Al Noumah, William; Jafar, Assef; Al Joumaa, Kadan
Format: Online Article Text
Language: English
Published: BioMed Central, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8751220/
https://www.ncbi.nlm.nih.gov/pubmed/35012681
http://dx.doi.org/10.1186/s13104-021-05902-3
Description
Summary: OBJECTIVE: Breast cancer is the most common cancer among women, and it causes many deaths every year. Early diagnosis increases the chance of cure through treatment. Traditional manual diagnosis requires effort and time from pathology experts, as it needs the joint experience of several pathologists, and diagnostic mistakes can lead to catastrophic results and endanger patients' lives. An expert system able to determine whether the examined tissue is healthy or not would therefore improve the quality of diagnosis and save the experts' time. In this paper, a model capable of classifying breast cancer tissue images by making use of pre-trained DCNNs is proposed. To build this model, the input image is first color-normalized using the Vahadane algorithm; then three pre-trained DCNNs (Xception, NASNet and Inception_ResNet_V2) are combined in parallel, and the three branches are aggregated so that each can take advantage of the others. The suggested model was tested under different threshold ratios and compared with other models. RESULTS: On the BreaKHis dataset, the proposed model achieved 98% accuracy, which is better than the accuracy of other models used in this field. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s13104-021-05902-3.
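
As a rough illustration of the pipeline the abstract describes, the sketch below is not taken from the paper: the file names, input size, the NASNetMobile variant, the frozen backbones, concatenation as the aggregation step, and the classifier head are all assumptions. It first applies Vahadane stain normalization with the staintools library, then builds three pre-trained branches in parallel with Keras and merges them.

# Sketch only: hyperparameters and the aggregation scheme are assumptions;
# the paper's exact configuration may differ.
import staintools
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import Xception, NASNetMobile, InceptionResNetV2

# Step 1: Vahadane stain color normalization against a reference tile
# (file names here are hypothetical).
target = staintools.LuminosityStandardizer.standardize(
    staintools.read_image("reference_tile.png"))
image = staintools.LuminosityStandardizer.standardize(
    staintools.read_image("breakhis_tile.png"))
normalizer = staintools.StainNormalizer(method="vahadane")
normalizer.fit(target)
normalized = normalizer.transform(image)

# Step 2: three pre-trained DCNN branches in parallel, then aggregation.
INPUT_SHAPE = (224, 224, 3)  # assumed input size

def branch(backbone_cls, name):
    # One ImageNet-pretrained backbone, frozen, ending in global average pooling.
    backbone = backbone_cls(include_top=False, weights="imagenet",
                            input_shape=INPUT_SHAPE)
    backbone.trainable = False
    inp = layers.Input(shape=INPUT_SHAPE)
    return Model(inp, layers.GlobalAveragePooling2D()(backbone(inp)), name=name)

inputs = layers.Input(shape=INPUT_SHAPE)
x = layers.Rescaling(1.0 / 127.5, offset=-1.0)(inputs)  # all three expect [-1, 1]
features = [branch(cls, n)(x) for cls, n in
            [(Xception, "xception_branch"),
             (NASNetMobile, "nasnet_branch"),          # "NASNet" variant assumed
             (InceptionResNetV2, "inception_resnet_v2_branch")]]
merged = layers.Concatenate()(features)                # aggregation method assumed
outputs = layers.Dense(1, activation="sigmoid")(merged)  # benign vs. malignant
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

Keeping the three branches frozen and merging their pooled features lets each backbone contribute its own pre-trained representation, which is one common way to realize the "branches take advantage of each other" idea; the authors may instead fine-tune the backbones or aggregate predictions rather than features.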