BENDR: Using Transformers and a Contrastive Self-Supervised Learning Task to Learn From Massive Amounts of EEG Data
Deep neural networks (DNNs) used for brain–computer interface (BCI) classification are commonly expected to learn general features when trained across a variety of contexts, such that these features could be fine-tuned to specific contexts. While some success is found in such an approach, we suggest...
Main Authors: Kostas, Demetres; Aroca-Ouellette, Stéphane; Rudzicz, Frank
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8261053/
https://www.ncbi.nlm.nih.gov/pubmed/34248521
http://dx.doi.org/10.3389/fnhum.2021.653659
Similar Items
- Machine learning for MEG during speech tasks
  by: Kostas, Demetres, et al.
  Published: (2019)
- Multibranch convolutional neural network with contrastive representation learning for decoding same limb motor imagery tasks
  by: Phunruangsakao, Chatrin, et al.
  Published: (2022)
- Survey on Self-Supervised Learning: Auxiliary Pretext Tasks and Contrastive Learning Methods in Imaging
  by: Albelwi, Saleh
  Published: (2022)
- Application of multi-task transfer learning: The combination of EA and optimized subband regularized CSP to classification of 8-channel EEG signals with small dataset
  by: Long, Taixue, et al.
  Published: (2023)
- Cross-Dataset Variability Problem in EEG Decoding With Deep Learning
  by: Xu, Lichao, et al.
  Published: (2020)