Neural Data Transformer 2: Multi-context Pretraining for Neural Spiking Activity
The neural population spiking activity recorded by intracortical brain-computer interfaces (iBCIs) contains rich structure. Current models of such spiking activity are largely prepared for individual experimental contexts, restricting data volume to that collectable within a single session and limiting the effectiveness of deep neural networks (DNNs). The purported challenge in aggregating neural spiking data is the pervasiveness of context-dependent shifts in the neural data distributions. However, large-scale unsupervised pretraining by nature spans heterogeneous data and has proven to be a fundamental recipe for successful representation learning across deep learning. We thus develop Neural Data Transformer 2 (NDT2), a spatiotemporal Transformer for neural spiking activity, and demonstrate that pretraining can leverage motor BCI datasets that span sessions, subjects, and experimental tasks. NDT2 enables rapid adaptation to novel contexts in downstream decoding tasks and opens the path to deployment of pretrained DNNs for iBCI control. Code: https://github.com/joel99/context_general_bci
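For orientation, the sketch below illustrates the general kind of masked spatiotemporal pretraining the abstract describes: binned spike counts are split into tokens, a fraction of tokens is masked, and a Transformer encoder is trained to reconstruct the masked activity. This is a minimal, hypothetical PyTorch example; the class name, patch size, mask ratio, and Poisson reconstruction loss are assumptions for illustration, not the paper's implementation, which is available at the linked repository (https://github.com/joel99/context_general_bci).

```python
# Minimal, hypothetical sketch of masked pretraining on binned spike counts.
# Names, shapes, and the Poisson loss are illustrative assumptions, not the
# actual NDT2 code (see https://github.com/joel99/context_general_bci).
import torch
import torch.nn as nn


class MaskedSpikeTransformer(nn.Module):
    """Embed (time x electrode-patch) spike-count tokens, mask a fraction,
    and train a Transformer encoder to reconstruct the masked activity."""

    def __init__(self, patch_dim=32, d_model=128, n_heads=4, n_layers=4, mask_ratio=0.25):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.embed = nn.Linear(patch_dim, d_model)            # spike counts -> token embedding
        self.mask_token = nn.Parameter(torch.zeros(d_model))  # learned placeholder for masked tokens
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.readout = nn.Linear(d_model, patch_dim)          # predict log-rates per channel

    def forward(self, spikes):
        # spikes: (batch, tokens, patch_dim); tokens = flattened time bins x electrode patches
        x = self.embed(spikes.float())
        mask = torch.rand(spikes.shape[:2], device=spikes.device) < self.mask_ratio
        x = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(x), x)  # hide masked tokens
        log_rates = self.readout(self.encoder(x))
        # Poisson negative log-likelihood evaluated on the masked tokens only
        return nn.functional.poisson_nll_loss(log_rates[mask], spikes[mask].float(), log_input=True)


# Toy usage: random integers stand in for real spike counts.
model = MaskedSpikeTransformer()
fake_spikes = torch.randint(0, 5, (8, 64, 32))  # (batch, tokens, channels per patch)
loss = model(fake_spikes)
loss.backward()
```

Positional embeddings are omitted here for brevity. In the multi-context setting the abstract describes, one would additionally condition each token on learned session, subject, and task embeddings so that heterogeneous recordings can be pooled during pretraining.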
Main Authors: | Ye, Joel; Collinger, Jennifer L.; Wehbe, Leila; Gaunt, Robert |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Cold Spring Harbor Laboratory, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10541112/ https://www.ncbi.nlm.nih.gov/pubmed/37781630 http://dx.doi.org/10.1101/2023.09.18.558113 |
author | Ye, Joel; Collinger, Jennifer L.; Wehbe, Leila; Gaunt, Robert
---|---|
collection | PubMed |
description | The neural population spiking activity recorded by intracortical brain-computer interfaces (iBCIs) contains rich structure. Current models of such spiking activity are largely prepared for individual experimental contexts, restricting data volume to that collectable within a single session and limiting the effectiveness of deep neural networks (DNNs). The purported challenge in aggregating neural spiking data is the pervasiveness of context-dependent shifts in the neural data distributions. However, large-scale unsupervised pretraining by nature spans heterogeneous data and has proven to be a fundamental recipe for successful representation learning across deep learning. We thus develop Neural Data Transformer 2 (NDT2), a spatiotemporal Transformer for neural spiking activity, and demonstrate that pretraining can leverage motor BCI datasets that span sessions, subjects, and experimental tasks. NDT2 enables rapid adaptation to novel contexts in downstream decoding tasks and opens the path to deployment of pretrained DNNs for iBCI control. Code: https://github.com/joel99/context_general_bci
format | Online Article Text |
id | pubmed-10541112 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Cold Spring Harbor Laboratory |
record_format | MEDLINE/PubMed |
spelling | pubmed-10541112 2023-10-01. bioRxiv Article. Cold Spring Harbor Laboratory, posted 2023-09-22. This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (https://creativecommons.org/licenses/by-nc/4.0/), which allows reusers to distribute, remix, adapt, and build upon the material in any medium or format for noncommercial purposes only, and only so long as attribution is given to the creator.
title | Neural Data Transformer 2: Multi-context Pretraining for Neural Spiking Activity |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10541112/ https://www.ncbi.nlm.nih.gov/pubmed/37781630 http://dx.doi.org/10.1101/2023.09.18.558113 |