
Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism

Sleep staging is the basis of sleep evaluation and a key step in the diagnosis of sleep-related diseases. Despite being useful, the existing sleep staging methods have several disadvantages, such as relying on manual feature extraction, failing to recognize temporal sequence patterns in long...

Full description

Bibliographic Details
Main Authors: Liu, Changyuan, Yin, Yunfu, Sun, Yuhan, Ersoy, Okan K.
Format: Online Article Text
Language: English
Published: Public Library of Science 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9202858/
https://www.ncbi.nlm.nih.gov/pubmed/35709101
http://dx.doi.org/10.1371/journal.pone.0269500
_version_ 1784728600602214400
author Liu, Changyuan
Yin, Yunfu
Sun, Yuhan
Ersoy, Okan K.
author_facet Liu, Changyuan
Yin, Yunfu
Sun, Yuhan
Ersoy, Okan K.
author_sort Liu, Changyuan
collection PubMed
description Sleep staging is the basis of sleep evaluation and a key step in the diagnosis of sleep-related diseases. Despite being useful, the existing sleep staging methods have several disadvantages, such as relying on manual feature extraction, failing to recognize temporal sequence patterns in long-term correlated data, and reaching the accuracy upper limit of sleep staging. Hence, this paper proposes an automatic electroencephalogram (EEG) sleep signal staging model based on Multi-scale Attention Residual Nets (MAResnet) and a Bidirectional Gated Recurrent Unit (BiGRU). The proposed model builds on the residual neural network in deep learning. Compared with the traditional residual learning module, it additionally uses improved channel and spatial feature attention units and convolution kernels of different sizes in parallel at the same position. Thus, multiscale feature extraction of the EEG sleep signals and residual learning of the neural network are performed to avoid network degradation. Finally, BiGRU is used to capture the dependence between sleep stages and to realize automatic learning of sleep staging features and sleep cycle extraction. In the experiments, the classification accuracy and kappa coefficient of the proposed method on the Sleep-EDF data set are 84.24% and 0.78, which are 0.24% and 0.21 higher, respectively, than those of the traditional residual net. The proposed method was also verified on the UCD and SHHS data sets, where the classification accuracies are 79.34% and 81.6%, respectively. Compared to related existing studies, the recognition accuracy is significantly improved, which validates the effectiveness and generalization performance of the proposed method.
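
The pipeline described above combines three ideas: residual blocks whose branches apply convolution kernels of different sizes in parallel (multi-scale features), channel and spatial attention units that re-weight those features, and a BiGRU over consecutive 30 s epochs to model stage-to-stage dependencies. The record does not include code, so the following is only a minimal PyTorch sketch of that kind of architecture, not the authors' implementation; all class names (ChannelSpatialAttention, MultiScaleAttnResBlock, SleepStager), layer widths, kernel sizes, and the 100 Hz / 3000-sample epoch length are illustrative assumptions, not values taken from the article.

import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    # Channel attention (squeeze-and-excitation style) followed by spatial attention.
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.spatial = nn.Sequential(
            nn.Conv1d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):                            # x: (batch, channels, time)
        x = x * self.channel(x)                      # re-weight feature channels
        return x * self.spatial(x)                   # re-weight time steps


class MultiScaleAttnResBlock(nn.Module):
    # Parallel convolutions with different kernel sizes, attention, and a residual shortcut.
    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 5, 7)):
        super().__init__()
        branch_ch = out_ch // len(kernel_sizes)
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_ch, branch_ch, k, padding=k // 2),
                nn.BatchNorm1d(branch_ch),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])
        self.attn = ChannelSpatialAttention(branch_ch * len(kernel_sizes))
        self.shortcut = nn.Conv1d(in_ch, branch_ch * len(kernel_sizes), kernel_size=1)

    def forward(self, x):
        out = torch.cat([b(x) for b in self.branches], dim=1)   # multi-scale features
        return torch.relu(self.attn(out) + self.shortcut(x))    # residual learning

class SleepStager(nn.Module):
    # MAResnet-style per-epoch feature extractor followed by a BiGRU across epochs.
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            MultiScaleAttnResBlock(1, 48),
            nn.MaxPool1d(4),
            MultiScaleAttnResBlock(48, 96),
            nn.AdaptiveAvgPool1d(1),                 # one feature vector per 30 s epoch
        )
        self.bigru = nn.GRU(96, 64, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * 64, n_classes)

    def forward(self, x):                            # x: (batch, n_epochs, samples), single-channel EEG
        b, n, t = x.shape
        feats = self.features(x.reshape(b * n, 1, t)).squeeze(-1)   # (b*n, 96)
        seq, _ = self.bigru(feats.reshape(b, n, -1))                # stage dependencies across epochs
        return self.classifier(seq)                  # per-epoch logits: (batch, n_epochs, n_classes)


if __name__ == "__main__":
    # Two recordings of 10 consecutive 30 s epochs at 100 Hz (3000 samples each),
    # roughly the single-channel Sleep-EDF setup.
    print(SleepStager()(torch.randn(2, 10, 3000)).shape)            # torch.Size([2, 10, 5])

A real training setup would additionally handle class imbalance, batch consecutive epochs per recording, and report overall accuracy and Cohen's kappa, the two metrics quoted in the abstract.
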
format Online
Article
Text
id pubmed-9202858
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-9202858 2022-06-17 Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism Liu, Changyuan Yin, Yunfu Sun, Yuhan Ersoy, Okan K. PLoS One Research Article Sleep staging is the basis of sleep evaluation and a key step in the diagnosis of sleep-related diseases. Despite being useful, the existing sleep staging methods have several disadvantages, such as relying on manual feature extraction, failing to recognize temporal sequence patterns in long-term correlated data, and reaching the accuracy upper limit of sleep staging. Hence, this paper proposes an automatic electroencephalogram (EEG) sleep signal staging model based on Multi-scale Attention Residual Nets (MAResnet) and a Bidirectional Gated Recurrent Unit (BiGRU). The proposed model builds on the residual neural network in deep learning. Compared with the traditional residual learning module, it additionally uses improved channel and spatial feature attention units and convolution kernels of different sizes in parallel at the same position. Thus, multiscale feature extraction of the EEG sleep signals and residual learning of the neural network are performed to avoid network degradation. Finally, BiGRU is used to capture the dependence between sleep stages and to realize automatic learning of sleep staging features and sleep cycle extraction. In the experiments, the classification accuracy and kappa coefficient of the proposed method on the Sleep-EDF data set are 84.24% and 0.78, which are 0.24% and 0.21 higher, respectively, than those of the traditional residual net. The proposed method was also verified on the UCD and SHHS data sets, where the classification accuracies are 79.34% and 81.6%, respectively. Compared to related existing studies, the recognition accuracy is significantly improved, which validates the effectiveness and generalization performance of the proposed method. Public Library of Science 2022-06-16 /pmc/articles/PMC9202858/ /pubmed/35709101 http://dx.doi.org/10.1371/journal.pone.0269500 Text en © 2022 Liu et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Liu, Changyuan
Yin, Yunfu
Sun, Yuhan
Ersoy, Okan K.
Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism
title Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism
title_full Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism
title_fullStr Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism
title_full_unstemmed Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism
title_short Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism
title_sort multi-scale resnet and bigru automatic sleep staging based on attention mechanism
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9202858/
https://www.ncbi.nlm.nih.gov/pubmed/35709101
http://dx.doi.org/10.1371/journal.pone.0269500
work_keys_str_mv AT liuchangyuan multiscaleresnetandbigruautomaticsleepstagingbasedonattentionmechanism
AT yinyunfu multiscaleresnetandbigruautomaticsleepstagingbasedonattentionmechanism
AT sunyuhan multiscaleresnetandbigruautomaticsleepstagingbasedonattentionmechanism
AT ersoyokank multiscaleresnetandbigruautomaticsleepstagingbasedonattentionmechanism