
Energy-based analog neural network framework


Bibliographic Details
Main Authors: Watfa, Mohamed; Garcia-Ortiz, Alberto; Sassatelli, Gilles
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10020340/
https://www.ncbi.nlm.nih.gov/pubmed/36936192
http://dx.doi.org/10.3389/fncom.2023.1114651
Collection: PubMed
Description: Over the past decade a body of work has emerged and shown the disruptive potential of neuromorphic systems across a broad range of studies, often combining novel machine learning models and nanotechnologies. Still, the scope of investigations often remains limited to simple problems since the process of building, training, and evaluating mixed-signal neural models is slow and laborious. In this paper, we introduce an open-source framework, called EBANA, that provides a unified, modularized, and extensible infrastructure, similar to conventional machine learning pipelines, for building and validating analog neural networks (ANNs). It uses Python as interface language with a syntax similar to Keras, while hiding the complexity of the underlying analog simulations. It already includes the most common building blocks and maintains sufficient modularity and extensibility to easily incorporate new concepts, electrical, and technological models. These features make EBANA suitable for researchers and practitioners to experiment with different design topologies and explore the various tradeoffs that exist in the design space. We illustrate the framework capabilities by elaborating on the increasingly popular Energy-Based Models (EBMs), used in conjunction with the local Equilibrium Propagation (EP) training algorithm. Our experiments cover 3 datasets having up to 60,000 entries and explore network topologies generating circuits in excess of 1,000 electrical nodes that can be extensively benchmarked with ease and in reasonable time thanks to the native EBANA parallelization capability.
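The Equilibrium Propagation (EP) algorithm mentioned in the abstract trains an energy-based model with two relaxation phases (a free phase and a weakly "nudged" phase toward the target) and a purely local, contrastive weight update. The following standalone NumPy sketch illustrates that two-phase update on a deliberately simple quadratic energy model whose fixed points can be written in closed form; it is an illustration of the EP principle only, not EBANA's actual API, and all names in it are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic energy model: E(y; W, x) = 0.5*||y||^2 - y @ (W @ x).
# Free-phase fixed point (dE/dy = 0):   y = W @ x
# Nudged phase adds beta * 0.5*||y - t||^2, shifting the fixed point
# toward the target t:                  y = (W @ x + beta*t) / (1 + beta)

def ep_step(W, x, t, beta=0.1, lr=0.1):
    y_free = W @ x                                 # free-phase fixed point
    y_nudged = (W @ x + beta * t) / (1.0 + beta)   # nudged-phase fixed point
    # EP's local contrastive update: -(1/beta) * (dE/dW|nudged - dE/dW|free).
    # Here dE/dW = -outer(y, x), so the update simplifies to:
    W += (lr / beta) * (np.outer(y_nudged, x) - np.outer(y_free, x))
    return y_free

# Train on a random linear teacher; as beta -> 0 the EP update approaches
# gradient descent on the squared prediction error.
W_true = rng.normal(size=(3, 5))
W = np.zeros((3, 5))
losses = []
for _ in range(300):
    x = rng.normal(size=5)
    t = W_true @ x
    y = ep_step(W, x, t)               # prediction before the update
    losses.append(float(np.mean((y - t) ** 2)))
```

Because the update contrasts only locally available quantities at the two equilibria, it requires no explicit backpropagation through the network, which is what makes EP attractive for the analog circuits the paper targets.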
Record ID: pubmed-10020340
Institution: National Center for Biotechnology Information
Record format: MEDLINE/PubMed
Journal: Front Comput Neurosci (Neuroscience)
Published online: 2023-03-03
Copyright © 2023 Watfa, Garcia-Ortiz and Sassatelli.
License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/). This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Topic: Neuroscience