
Assigning channel weights using an attention mechanism: an EEG interpolation algorithm

During the acquisition of electroencephalographic (EEG) signals, various factors can influence the data and lead to the presence of one or multiple bad channels. Bad channel interpolation is the use of good channel data to reconstruct bad channels, thereby maintaining the original dimensions of the...

Full description

Bibliographic Details
Main Authors: Liu, Renjie, Wang, Zaijun, Qiu, Jiang, Wang, Xue
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10552919/
https://www.ncbi.nlm.nih.gov/pubmed/37811329
http://dx.doi.org/10.3389/fnins.2023.1251677
_version_ 1785116056513150976
author Liu, Renjie
Wang, Zaijun
Qiu, Jiang
Wang, Xue
author_facet Liu, Renjie
Wang, Zaijun
Qiu, Jiang
Wang, Xue
author_sort Liu, Renjie
collection PubMed
description During the acquisition of electroencephalographic (EEG) signals, various factors can influence the data and lead to the presence of one or multiple bad channels. Bad channel interpolation is the use of good channel data to reconstruct bad channels, thereby maintaining the original dimensions of the data for subsequent analysis tasks. Mainstream interpolation algorithms assign weights to channels based on the physical distance between electrodes and do not take into account the effect of physiological factors on the EEG signal. The algorithm proposed in this study utilizes an attention mechanism to allocate channel weights (AMACW). The model learns the correlations among channels from good channel data. Interpolation then assigns weights based on these learned correlations without requiring electrode location information, overcoming the limitation that traditional methods cannot interpolate bad channels at unknown locations. To avoid an overly concentrated weight distribution when the model generates data, we designed channel masking (CM). This method spreads attention and allows the model to utilize data from multiple channels. We evaluated the reconstruction performance of the model using EEG data with 1 to 5 bad channels. With EEGLAB’s interpolation method as a performance reference, tests show that the AMACW models can effectively reconstruct bad channels.
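The description above outlines the mechanism at a conceptual level: attention weights over the remaining good channels replace distance-based weights, and channel masking during training keeps those weights from collapsing onto a few channels. The following Python sketch is an illustrative assumption, not the authors' published implementation; names such as AttnInterpolator, embed_dim, and mask_prob are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttnInterpolator(nn.Module):
    """Reconstruct one bad EEG channel as an attention-weighted sum of good channels."""

    def __init__(self, n_channels: int, embed_dim: int = 32, mask_prob: float = 0.2):
        super().__init__()
        # One learnable embedding per electrode; channel correlations are learned
        # through these embeddings rather than taken from electrode distance.
        self.embed = nn.Embedding(n_channels, embed_dim)
        self.mask_prob = mask_prob

    def forward(self, x: torch.Tensor, bad_idx: int, training_mask: bool = False):
        # x: (batch, n_channels, n_samples) EEG segment with the bad channel zeroed out.
        batch, n_ch, _ = x.shape
        device = x.device
        good = torch.ones(n_ch, dtype=torch.bool, device=device)
        good[bad_idx] = False
        if training_mask:
            # Channel masking (CM): hide a random subset of good channels as well,
            # forcing the model to spread attention over many channels.
            drop = torch.rand(n_ch, device=device) < self.mask_prob
            good &= ~drop
            good[bad_idx] = False
        idx = torch.arange(n_ch, device=device)
        q = self.embed(idx[bad_idx])          # (embed_dim,) query for the bad channel
        k = self.embed(idx[good])             # (n_good, embed_dim) keys for good channels
        scores = k @ q / k.shape[-1] ** 0.5   # scaled dot-product similarity
        w = F.softmax(scores, dim=0)          # attention weights over good channels
        # Weighted sum of good-channel data reconstructs the bad channel.
        recon = torch.einsum("g,bgs->bs", w, x[:, good, :])
        return recon, w

In a setup like this, training would treat a randomly chosen good channel as "bad", reconstruct it from the others, and minimize the mean squared error against its true signal; at inference time the learned attention weights determine how much each good channel contributes to the interpolated one.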
format Online
Article
Text
id pubmed-10552919
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-105529192023-10-06 Assigning channel weights using an attention mechanism: an EEG interpolation algorithm Liu, Renjie Wang, Zaijun Qiu, Jiang Wang, Xue Front Neurosci Neuroscience During the acquisition of electroencephalographic (EEG) signals, various factors can influence the data and lead to the presence of one or multiple bad channels. Bad channel interpolation is the use of good channel data to reconstruct bad channels, thereby maintaining the original dimensions of the data for subsequent analysis tasks. Mainstream interpolation algorithms assign weights to channels based on the physical distance between electrodes and do not take into account the effect of physiological factors on the EEG signal. The algorithm proposed in this study utilizes an attention mechanism to allocate channel weights (AMACW). The model learns the correlations among channels from good channel data. Interpolation then assigns weights based on these learned correlations without requiring electrode location information, overcoming the limitation that traditional methods cannot interpolate bad channels at unknown locations. To avoid an overly concentrated weight distribution when the model generates data, we designed channel masking (CM). This method spreads attention and allows the model to utilize data from multiple channels. We evaluated the reconstruction performance of the model using EEG data with 1 to 5 bad channels. With EEGLAB’s interpolation method as a performance reference, tests show that the AMACW models can effectively reconstruct bad channels. Frontiers Media S.A. 2023-09-21 /pmc/articles/PMC10552919/ /pubmed/37811329 http://dx.doi.org/10.3389/fnins.2023.1251677 Text en Copyright © 2023 Liu, Wang, Qiu and Wang. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Liu, Renjie
Wang, Zaijun
Qiu, Jiang
Wang, Xue
Assigning channel weights using an attention mechanism: an EEG interpolation algorithm
title Assigning channel weights using an attention mechanism: an EEG interpolation algorithm
title_full Assigning channel weights using an attention mechanism: an EEG interpolation algorithm
title_fullStr Assigning channel weights using an attention mechanism: an EEG interpolation algorithm
title_full_unstemmed Assigning channel weights using an attention mechanism: an EEG interpolation algorithm
title_short Assigning channel weights using an attention mechanism: an EEG interpolation algorithm
title_sort assigning channel weights using an attention mechanism: an eeg interpolation algorithm
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10552919/
https://www.ncbi.nlm.nih.gov/pubmed/37811329
http://dx.doi.org/10.3389/fnins.2023.1251677
work_keys_str_mv AT liurenjie assigningchannelweightsusinganattentionmechanismaneeginterpolationalgorithm
AT wangzaijun assigningchannelweightsusinganattentionmechanismaneeginterpolationalgorithm
AT qiujiang assigningchannelweightsusinganattentionmechanismaneeginterpolationalgorithm
AT wangxue assigningchannelweightsusinganattentionmechanismaneeginterpolationalgorithm