State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere
Recovering and distinguishing different ionospheric layers and signals usually requires slow and complicated procedures. In this work, we construct and train five convolutional neural network (CNN) models: DeepLab, fully convolutional DenseNet24 (FC-DenseNet24), deep watershed transform (DWT), Mask R-CNN, and spatial attention-UNet (SA-UNet) for the recovery of ionograms. The performance of the models is evaluated by intersection over union (IoU). We collect and manually label 6131 ionograms acquired from a low-latitude ionosonde in Taiwan. These ionograms are contaminated by strong quasi-static noise, with an average signal-to-noise ratio (SNR) equal to 1.4. Applying the five models to these noisy ionograms, we show that the models can recover useful signals with IoU > 0.6. The highest accuracy is achieved by SA-UNet. Signals that account for less than 15% of the samples in the data set can still be recovered to some degree by Mask R-CNN (IoU > 0.2). In addition to the number of samples, we identify and examine the effects of three factors on the recovery accuracy of the different models: (1) SNR, (2) the shape of the signal, and (3) the overlapping of signals. Our results indicate that FC-DenseNet24, DWT, Mask R-CNN, and SA-UNet are capable of identifying signals from very noisy ionograms (SNR < 1.4), that overlapping signals can be well identified by DWT, Mask R-CNN, and SA-UNet, and that more elongated signals are better identified by all models.
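The abstract evaluates the models by intersection over union (IoU) between predicted and hand-labeled signal masks and characterizes the noise level by a signal-to-noise ratio (SNR). The following is a minimal sketch of what these two quantities mean for binary ionogram masks; the function names, the particular SNR definition, and the toy arrays are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection over union of two masks of equal shape."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    union = np.logical_or(pred, true).sum()
    return float(intersection) / float(union) if union > 0 else 0.0

def snr(ionogram: np.ndarray, signal_mask: np.ndarray) -> float:
    """One simple SNR definition: mean amplitude inside the labeled
    signal region divided by the mean amplitude outside it."""
    mask = signal_mask.astype(bool)
    signal = ionogram[mask]
    noise = ionogram[~mask]
    return float(signal.mean()) / float(noise.mean())

# Toy example: a prediction that covers the labeled region reasonably well
# reaches IoU = 0.6, the threshold quoted in the abstract for useful recovery.
truth = np.zeros((8, 8), dtype=bool)
truth[2:6, 2:6] = True
pred = np.zeros((8, 8), dtype=bool)
pred[3:7, 2:6] = True
print(f"IoU = {iou(pred, truth):.2f}")  # prints "IoU = 0.60"
```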
Main Authors: | Chang, Yu-Chi, Lin, Chia-Hsien, Dmitriev, Alexei V., Hsieh, Mon-Chai, Hsu, Hao-Wei, Lin, Yu-Ciang, Mendoza, Merlin M., Huang, Guan-Han, Tsai, Lung-Chih, Li, Yung-Hui, Tsogtbaatar, Enkhtuya |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2022 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9002747/ https://www.ncbi.nlm.nih.gov/pubmed/35408372 http://dx.doi.org/10.3390/s22072758 |
_version_ | 1784685964022513664 |
---|---|
author | Chang, Yu-Chi Lin, Chia-Hsien Dmitriev, Alexei V. Hsieh, Mon-Chai Hsu, Hao-Wei Lin, Yu-Ciang Mendoza, Merlin M. Huang, Guan-Han Tsai, Lung-Chih Li, Yung-Hui Tsogtbaatar, Enkhtuya |
author_facet | Chang, Yu-Chi Lin, Chia-Hsien Dmitriev, Alexei V. Hsieh, Mon-Chai Hsu, Hao-Wei Lin, Yu-Ciang Mendoza, Merlin M. Huang, Guan-Han Tsai, Lung-Chih Li, Yung-Hui Tsogtbaatar, Enkhtuya |
author_sort | Chang, Yu-Chi |
collection | PubMed |
description | Recovering and distinguishing different ionospheric layers and signals usually requires slow and complicated procedures. In this work, we construct and train five convolutional neural network (CNN) models: DeepLab, fully convolutional DenseNet24 (FC-DenseNet24), deep watershed transform (DWT), Mask R-CNN, and spatial attention-UNet (SA-UNet) for the recovery of ionograms. The performance of the models is evaluated by intersection over union (IoU). We collect and manually label 6131 ionograms acquired from a low-latitude ionosonde in Taiwan. These ionograms are contaminated by strong quasi-static noise, with an average signal-to-noise ratio (SNR) equal to 1.4. Applying the five models to these noisy ionograms, we show that the models can recover useful signals with IoU > 0.6. The highest accuracy is achieved by SA-UNet. Signals that account for less than 15% of the samples in the data set can still be recovered to some degree by Mask R-CNN (IoU > 0.2). In addition to the number of samples, we identify and examine the effects of three factors on the recovery accuracy of the different models: (1) SNR, (2) the shape of the signal, and (3) the overlapping of signals. Our results indicate that FC-DenseNet24, DWT, Mask R-CNN, and SA-UNet are capable of identifying signals from very noisy ionograms (SNR < 1.4), that overlapping signals can be well identified by DWT, Mask R-CNN, and SA-UNet, and that more elongated signals are better identified by all models. |
format | Online Article Text |
id | pubmed-9002747 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9002747 2022-04-13 State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere Chang, Yu-Chi Lin, Chia-Hsien Dmitriev, Alexei V. Hsieh, Mon-Chai Hsu, Hao-Wei Lin, Yu-Ciang Mendoza, Merlin M. Huang, Guan-Han Tsai, Lung-Chih Li, Yung-Hui Tsogtbaatar, Enkhtuya Sensors (Basel) Article Recovering and distinguishing different ionospheric layers and signals usually requires slow and complicated procedures. In this work, we construct and train five convolutional neural network (CNN) models: DeepLab, fully convolutional DenseNet24 (FC-DenseNet24), deep watershed transform (DWT), Mask R-CNN, and spatial attention-UNet (SA-UNet) for the recovery of ionograms. The performance of the models is evaluated by intersection over union (IoU). We collect and manually label 6131 ionograms acquired from a low-latitude ionosonde in Taiwan. These ionograms are contaminated by strong quasi-static noise, with an average signal-to-noise ratio (SNR) equal to 1.4. Applying the five models to these noisy ionograms, we show that the models can recover useful signals with IoU > 0.6. The highest accuracy is achieved by SA-UNet. Signals that account for less than 15% of the samples in the data set can still be recovered to some degree by Mask R-CNN (IoU > 0.2). In addition to the number of samples, we identify and examine the effects of three factors on the recovery accuracy of the different models: (1) SNR, (2) the shape of the signal, and (3) the overlapping of signals. Our results indicate that FC-DenseNet24, DWT, Mask R-CNN, and SA-UNet are capable of identifying signals from very noisy ionograms (SNR < 1.4), that overlapping signals can be well identified by DWT, Mask R-CNN, and SA-UNet, and that more elongated signals are better identified by all models. MDPI 2022-04-02 /pmc/articles/PMC9002747/ /pubmed/35408372 http://dx.doi.org/10.3390/s22072758 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Chang, Yu-Chi Lin, Chia-Hsien Dmitriev, Alexei V. Hsieh, Mon-Chai Hsu, Hao-Wei Lin, Yu-Ciang Mendoza, Merlin M. Huang, Guan-Han Tsai, Lung-Chih Li, Yung-Hui Tsogtbaatar, Enkhtuya State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere |
title | State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere |
title_full | State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere |
title_fullStr | State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere |
title_full_unstemmed | State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere |
title_short | State-of-the-Art Capability of Convolutional Neural Networks to Distinguish the Signal in the Ionosphere |
title_sort | state-of-the-art capability of convolutional neural networks to distinguish the signal in the ionosphere |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9002747/ https://www.ncbi.nlm.nih.gov/pubmed/35408372 http://dx.doi.org/10.3390/s22072758 |
work_keys_str_mv | AT changyuchi stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT linchiahsien stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT dmitrievalexeiv stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT hsiehmonchai stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT hsuhaowei stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT linyuciang stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT mendozamerlinm stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT huangguanhan stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT tsailungchih stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT liyunghui stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere AT tsogtbaatarenkhtuya stateoftheartcapabilityofconvolutionalneuralnetworkstodistinguishthesignalintheionosphere |