
Adversarial attacks on spiking convolutional neural networks for event-based vision

Event-based dynamic vision sensors provide very sparse output in the form of spikes, which makes them suitable for low-power applications. Convolutional spiking neural networks model such event-based data and develop their full energy-saving potential when deployed on asynchronous neuromorphic hardware. Event-based vision being a nascent field, the sensitivity of spiking neural networks to potentially malicious adversarial attacks has received little attention so far. We show how white-box adversarial attack algorithms can be adapted to the discrete and sparse nature of event-based visual data, and demonstrate smaller perturbation magnitudes at higher success rates than the current state-of-the-art algorithms. For the first time, we also verify the effectiveness of these perturbations directly on neuromorphic hardware. Finally, we discuss the properties of the resulting perturbations, the effect of adversarial training as a defense strategy, and future directions.


Bibliographic Details
Main Authors: Büchel, Julian; Lenz, Gregor; Hu, Yalun; Sheik, Sadique; Sorbaro, Martino
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9831110/
https://www.ncbi.nlm.nih.gov/pubmed/36636576
http://dx.doi.org/10.3389/fnins.2022.1068193
Collection: PubMed
Record ID: pubmed-9831110
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Front Neurosci (Neuroscience)
Published online: 2022-12-22
Description: Event-based dynamic vision sensors provide very sparse output in the form of spikes, which makes them suitable for low-power applications. Convolutional spiking neural networks model such event-based data and develop their full energy-saving potential when deployed on asynchronous neuromorphic hardware. Event-based vision being a nascent field, the sensitivity of spiking neural networks to potentially malicious adversarial attacks has received little attention so far. We show how white-box adversarial attack algorithms can be adapted to the discrete and sparse nature of event-based visual data, and demonstrate smaller perturbation magnitudes at higher success rates than the current state-of-the-art algorithms. For the first time, we also verify the effectiveness of these perturbations directly on neuromorphic hardware. Finally, we discuss the properties of the resulting perturbations, the effect of adversarial training as a defense strategy, and future directions.
License: Copyright © 2022 Büchel, Lenz, Hu, Sheik and Sorbaro. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Topic: Neuroscience