A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity
The effective decoding of movement from non-invasive electroencephalography (EEG) is essential for informing several therapeutic interventions, from neurorehabilitation robots to neural prosthetics. Deep neural networks are most suitable for decoding real-time data but their use in EEG is hindered by the gross classes of motor tasks in the currently available datasets…
| Main Authors: | Kumar, Neelesh; Michmizos, Konstantinos P. |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Nature Publishing Group UK, 2022 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8776813/ https://www.ncbi.nlm.nih.gov/pubmed/35058514 http://dx.doi.org/10.1038/s41598-022-05079-0 |
| Field | Value |
|---|---|
| _version_ | 1784636919186980864 |
author | Kumar, Neelesh Michmizos, Konstantinos P. |
author_facet | Kumar, Neelesh Michmizos, Konstantinos P. |
author_sort | Kumar, Neelesh |
collection | PubMed |
description | The effective decoding of movement from non-invasive electroencephalography (EEG) is essential for informing several therapeutic interventions, from neurorehabilitation robots to neural prosthetics. Deep neural networks are most suitable for decoding real-time data but their use in EEG is hindered by the gross classes of motor tasks in the currently available datasets, which are solvable even with network architectures that do not require specialized design considerations. Moreover, the weak association with the underlying neurophysiology limits the generalizability of modern networks for EEG inference. Here, we present a neurophysiologically interpretable 3-dimensional convolutional neural network (3D-CNN) that captured the spatiotemporal dependencies in brain areas that get co-activated during movement. The 3D-CNN received topography-preserving EEG inputs, and predicted complex components of hand movements performed on a plane using a back-drivable rehabilitation robot, namely (a) the reaction time (RT) for responding to stimulus (slow or fast), (b) the mode of movement (active or passive, depending on whether there was an assistive force provided by the apparatus), and (c) the orthogonal directions of the movement (left, right, up, or down). We validated the 3D-CNN on a new dataset that we acquired from an in-house motor experiment, where it achieved average leave-one-subject-out test accuracies of 79.81%, 81.23%, and 82.00% for RT, active vs. passive, and direction classifications, respectively. Our proposed method outperformed the modern 2D-CNN architecture by a range of 1.1% to 6.74% depending on the classification task. Further, we identified the EEG sensors and time segments crucial to the classification decisions of the network, which aligned well with the current neurophysiological knowledge on brain activity in motor planning and execution tasks. 
Our results demonstrate the importance of biological relevance in networks for an accurate decoding of EEG, suggesting that the real-time classification of other complex brain activities may now be within our reach. |
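The abstract reports average leave-one-subject-out (LOSO) test accuracies. In LOSO evaluation, each subject's trials are held out once as the test set while the model trains on all remaining subjects, and the per-subject test accuracies are then averaged. A minimal sketch of that split scheme (hypothetical helper functions, not the authors' code):

```python
def loso_splits(subject_ids):
    """Yield (train_ids, held_out_id) pairs, holding out each subject once.

    This mirrors leave-one-subject-out cross-validation: the model never
    sees the held-out subject's data during training.
    """
    for held_out in subject_ids:
        train_ids = [s for s in subject_ids if s != held_out]
        yield train_ids, held_out


def average_loso_accuracy(per_subject_accuracy):
    """Average the test accuracies obtained on each held-out subject."""
    return sum(per_subject_accuracy.values()) / len(per_subject_accuracy)
```

Reported figures such as 79.81% would then be the output of `average_loso_accuracy` over all held-out subjects.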
format | Online Article Text |
id | pubmed-8776813 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-87768132022-01-24 A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity Kumar, Neelesh Michmizos, Konstantinos P. Sci Rep Article The effective decoding of movement from non-invasive electroencephalography (EEG) is essential for informing several therapeutic interventions, from neurorehabilitation robots to neural prosthetics. Deep neural networks are most suitable for decoding real-time data but their use in EEG is hindered by the gross classes of motor tasks in the currently available datasets, which are solvable even with network architectures that do not require specialized design considerations. Moreover, the weak association with the underlying neurophysiology limits the generalizability of modern networks for EEG inference. Here, we present a neurophysiologically interpretable 3-dimensional convolutional neural network (3D-CNN) that captured the spatiotemporal dependencies in brain areas that get co-activated during movement. The 3D-CNN received topography-preserving EEG inputs, and predicted complex components of hand movements performed on a plane using a back-drivable rehabilitation robot, namely (a) the reaction time (RT) for responding to stimulus (slow or fast), (b) the mode of movement (active or passive, depending on whether there was an assistive force provided by the apparatus), and (c) the orthogonal directions of the movement (left, right, up, or down). We validated the 3D-CNN on a new dataset that we acquired from an in-house motor experiment, where it achieved average leave-one-subject-out test accuracies of 79.81%, 81.23%, and 82.00% for RT, active vs. passive, and direction classifications, respectively. Our proposed method outperformed the modern 2D-CNN architecture by a range of 1.1% to 6.74% depending on the classification task. 
Further, we identified the EEG sensors and time segments crucial to the classification decisions of the network, which aligned well with the current neurophysiological knowledge on brain activity in motor planning and execution tasks. Our results demonstrate the importance of biological relevance in networks for an accurate decoding of EEG, suggesting that the real-time classification of other complex brain activities may now be within our reach. Nature Publishing Group UK 2022-01-20 /pmc/articles/PMC8776813/ /pubmed/35058514 http://dx.doi.org/10.1038/s41598-022-05079-0 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/Open AccessThis article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Article Kumar, Neelesh Michmizos, Konstantinos P. A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity |
title | A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity |
title_full | A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity |
title_fullStr | A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity |
title_full_unstemmed | A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity |
title_short | A neurophysiologically interpretable deep neural network predicts complex movement components from brain activity |
title_sort | neurophysiologically interpretable deep neural network predicts complex movement components from brain activity |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8776813/ https://www.ncbi.nlm.nih.gov/pubmed/35058514 http://dx.doi.org/10.1038/s41598-022-05079-0 |
work_keys_str_mv | AT kumarneelesh aneurophysiologicallyinterpretabledeepneuralnetworkpredictscomplexmovementcomponentsfrombrainactivity AT michmizoskonstantinosp aneurophysiologicallyinterpretabledeepneuralnetworkpredictscomplexmovementcomponentsfrombrainactivity AT kumarneelesh neurophysiologicallyinterpretabledeepneuralnetworkpredictscomplexmovementcomponentsfrombrainactivity AT michmizoskonstantinosp neurophysiologicallyinterpretabledeepneuralnetworkpredictscomplexmovementcomponentsfrombrainactivity |
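The record's abstract describes "topography-preserving EEG inputs": rather than a flat channels-by-time matrix, each time sample is laid out on a 2-D grid mirroring sensor positions on the scalp, producing a 3-D (time, rows, cols) tensor suitable for a 3D-CNN. A minimal sketch of that layout, using an illustrative toy 3x3 montage rather than the paper's actual sensor map:

```python
# Illustrative sensor-to-grid map (NOT the paper's montage): each EEG
# sensor name is assigned a (row, col) cell on a small scalp-like grid.
SENSOR_GRID = {
    "F3": (0, 0), "Fz": (0, 1), "F4": (0, 2),
    "C3": (1, 0), "Cz": (1, 1), "C4": (1, 2),
    "P3": (2, 0), "Pz": (2, 1), "P4": (2, 2),
}


def to_topographic_tensor(eeg, n_rows=3, n_cols=3):
    """Convert {sensor: [samples...]} into a (time, rows, cols) nested list.

    Neighbouring cells correspond to neighbouring scalp locations, so 3-D
    convolutions can capture spatiotemporal dependencies across co-activated
    areas. Grid cells with no mapped sensor stay 0.0.
    """
    n_t = len(next(iter(eeg.values())))
    tensor = [[[0.0] * n_cols for _ in range(n_rows)] for _ in range(n_t)]
    for name, samples in eeg.items():
        r, c = SENSOR_GRID[name]
        for t, v in enumerate(samples):
            tensor[t][r][c] = v
    return tensor
```

The design point is only that spatial adjacency of sensors is preserved in the input; the real network would consume such tensors with 3-D convolutional layers.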