Enhancing spiking neural networks with hybrid top-down attention


Bibliographic Details
Main Authors: Liu, Faqiang; Zhao, Rong
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9443487/
https://www.ncbi.nlm.nih.gov/pubmed/36071719
http://dx.doi.org/10.3389/fnins.2022.949142
id: pubmed-9443487
collection: PubMed
institution: National Center for Biotechnology Information
record format: MEDLINE/PubMed
journal: Front Neurosci
description: As the representatives of brain-inspired models at the neuronal level, spiking neural networks (SNNs) have shown great promise in processing spatiotemporal information with intrinsic temporal dynamics. SNNs are expected to further improve their robustness and computing efficiency by introducing top-down attention at the architectural level, which is crucial for the human brain to support advanced intelligence. However, this attempt encounters difficulties in optimizing the attention in SNNs largely due to the lack of annotations. Here, we develop a hybrid network model with a top-down attention mechanism (HTDA) by incorporating an artificial neural network (ANN) to generate attention maps based on the features extracted by a feedforward SNN. The attention map is then used to modulate the encoding layer of the SNN so that it focuses on the most informative sensory input. To facilitate direct learning of attention maps and avoid labor-intensive annotations, we propose a general principle and a corresponding weakly-supervised objective, which promotes the HTDA model to utilize an integral and small subset of the input to give accurate predictions. On this basis, the ANN and the SNN can be jointly optimized by surrogate gradient descent in an end-to-end manner. We comprehensively evaluated the HTDA model on object recognition tasks, which demonstrates strong robustness to adversarial noise, high computing efficiency, and good interpretability. On the widely-adopted CIFAR-10, CIFAR-100, and MNIST benchmarks, the HTDA model reduces firing rates by up to 50% and improves adversarial robustness by up to 10% with comparable or better accuracy compared with the state-of-the-art SNNs. The HTDA model is also verified on dynamic neuromorphic datasets and achieves consistent improvements. This study provides a new way to boost the performance of SNNs by employing a hybrid top-down attention mechanism.
published online: 2022-08-22
license: Copyright © 2022 Liu and Zhao. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
topic: Neuroscience
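The mechanism the abstract describes — a feedforward SNN extracts features, an ANN turns them into an attention map, and that map gates the SNN's encoding layer so fewer spikes are spent on uninformative input — can be caricatured in a few lines of Python. Everything below (the LIF parameters, the one-weight sigmoid standing in for the ANN, the toy input) is a hypothetical illustration of the general idea, not the authors' implementation from the paper.

```python
import math

def lif_encode(x, threshold=1.0, decay=0.5, steps=8):
    """Rate-code a non-negative input vector with leaky integrate-and-fire
    neurons; returns each neuron's spike count over `steps` time steps."""
    v = [0.0] * len(x)
    counts = [0] * len(x)
    for _ in range(steps):
        for i, xi in enumerate(x):
            v[i] = decay * v[i] + xi      # leaky membrane integration
            if v[i] >= threshold:         # fire, then reset
                counts[i] += 1
                v[i] = 0.0
    return counts

def attention_map(features, w=4.0, b=-1.0):
    """Stand-in for the ANN branch: a one-parameter sigmoid over the
    SNN's firing-rate features gives a per-input gate in (0, 1)."""
    return [1.0 / (1.0 + math.exp(-(w * f + b))) for f in features]

# Feedforward pass: the SNN extracts features (here, firing rates).
steps = 8
x = [0.6, 0.1, 0.9, 0.05]                 # toy sensory input
base_counts = lif_encode(x, steps=steps)
features = [c / steps for c in base_counts]

# Top-down pass: the ANN's attention map modulates the encoding layer,
# so the re-encoded input concentrates on the most informative entries.
gates = attention_map(features)
gated_x = [g * xi for g, xi in zip(gates, x)]
gated_counts = lif_encode(gated_x, steps=steps)

# Weakly-supervised objective sketch: a task loss (omitted here) plus a
# sparsity term that pushes the model to use a small subset of the input.
sparsity_penalty = sum(gates) / len(gates)
```

With suppressive gates in (0, 1), the gated encoding fires at most as often as the ungated one, which is the firing-rate reduction the abstract reports; in the real model both branches would be trained jointly with surrogate gradients rather than fixed by hand.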