COVID-19 disease identification from chest CT images using empirical wavelet transformation and transfer learning
| Main Authors | , , , , , |
|---|---|
| Format | Online Article Text |
| Language | English |
| Published | Elsevier Ltd., 2022 |
| Subjects | |
| Online Access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8384584/ https://www.ncbi.nlm.nih.gov/pubmed/34457034 http://dx.doi.org/10.1016/j.bspc.2021.103076 |
Summary: | In the current scenario, the spread of the novel coronavirus disease (COVID-19) is increasing day by day, and it is very important to control and cure this disease. Reverse transcription-polymerase chain reaction (RT-PCR) and chest computerized tomography (CT) imaging are available as significantly useful and reliable tools for identifying COVID-19 within an epidemic region. Most hospitals have CT imaging machines, so chest CT images can be used for early diagnosis and classification of COVID-19 patients. However, this requires a radiology expert and a good amount of time to classify chest CT-based COVID-19 images, especially when the disease is spreading at a rapid rate. During the COVID-19 pandemic, there is therefore a need for an efficient automated way to screen for infection, and CT is one of the best ways to detect infection in patients. This paper introduces a new method for preprocessing CT scan images and classifying them as COVID-19 positive or negative. The proposed method applies empirical wavelet transformation for preprocessing; the best components of the red, green, and blue channels of the image are selected and used to train the proposed network. With the proposed methodology, a classification accuracy of 85.5%, an F1 score of 85.28%, and an AUC of 96.6% are achieved. |
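The preprocessing idea described in the summary, decomposing each RGB channel with an empirical wavelet transform and keeping only the most informative component, can be sketched as follows. This is a simplified illustration, not the authors' implementation: a true EWT chooses band boundaries adaptively from the spectrum, whereas this sketch uses fixed equal-width radial FFT bands, and the "best component" criterion (maximum energy per channel) is an assumption.

```python
import numpy as np

def band_components(channel, n_bands=4):
    """Split a 2-D channel into radial frequency sub-bands via FFT masking.

    A simplified stand-in for an empirical wavelet decomposition: a real EWT
    would detect band boundaries adaptively from the magnitude spectrum;
    here equal-width radial bands are used for illustration only.
    """
    F = np.fft.fftshift(np.fft.fft2(channel))
    h, w = channel.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)       # distance from the DC bin
    r_max = r.max()
    comps = []
    for b in range(n_bands):
        mask = (r >= b * r_max / n_bands) & (r < (b + 1) * r_max / n_bands)
        comps.append(np.real(np.fft.ifft2(np.fft.ifftshift(F * mask))))
    return comps

def best_component(channel, n_bands=4):
    # Keep the sub-band with the highest energy -- an assumed proxy for the
    # "best component" selection mentioned in the abstract.
    comps = band_components(channel, n_bands)
    energies = [np.sum(c ** 2) for c in comps]
    return comps[int(np.argmax(energies))]

def preprocess(rgb):
    """rgb: (H, W, 3) array; returns per-channel best components stacked,
    which would then be fed to the transfer-learning classifier."""
    return np.stack([best_component(rgb[..., c]) for c in range(3)], axis=-1)

img = np.random.rand(64, 64, 3)   # stand-in for a chest CT slice rendered as RGB
out = preprocess(img)
print(out.shape)  # (64, 64, 3)
```

The preprocessed stack keeps the input shape, so it can be passed directly to a pretrained CNN backbone for the transfer-learning step the paper describes.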