On reaction network implementations of neural networks

This paper is concerned with the utilization of deterministically modelled chemical reaction networks for the implementation of (feed-forward) neural networks. We develop a general mathematical framework and prove that the ordinary differential equations (ODEs) associated with certain reaction network implementations of neural networks have desirable properties including (i) existence of unique positive fixed points that are smooth in the parameters of the model (necessary for gradient descent) and (ii) fast convergence to the fixed point regardless of initial condition (necessary for efficient implementation). We do so by first making a connection between neural networks and fixed points for systems of ODEs, and then by constructing reaction networks with the correct associated set of ODEs. We demonstrate the theory by constructing a reaction network that implements a neural network with a smoothed ReLU activation function, though we also demonstrate how to generalize the construction to allow for other activation functions (each with the desirable properties listed previously). As there are multiple types of ‘networks’ used in this paper, we also give a careful introduction to both reaction networks and neural networks, in order to disambiguate the overlapping vocabulary in the two settings and to clearly highlight the role of each network’s properties.
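
As a rough illustration of the idea described in the abstract (a sketch of the general principle, not the construction given in the paper), consider a mass-action system in which an input species X is held at concentration x and a toy network X + Y -> X + 2Y, 2Y -> Y, ∅ -> Y (the last with a small rate eps) governs an output species Y. The resulting ODE dy/dt = x*y - y^2 + eps has the unique positive fixed point y* = (x + sqrt(x^2 + 4*eps))/2, a smoothed ReLU of x that is approached from any positive initial condition.

```python
# Illustrative sketch only: a toy mass-action ODE whose unique positive fixed point
# is a smoothed ReLU of the input. The specific reactions and rates assumed here are
# for illustration and are not taken from the paper's text.
import numpy as np

def smoothed_relu(x: float, eps: float = 1e-3) -> float:
    """Closed-form positive fixed point of dy/dt = x*y - y**2 + eps."""
    return 0.5 * (x + np.sqrt(x * x + 4.0 * eps))

def integrate(x: float, eps: float = 1e-3, y0: float = 1.0,
              dt: float = 1e-3, steps: int = 200_000) -> float:
    """Forward-Euler integration of dy/dt = x*y - y**2 + eps from a positive start."""
    y = y0
    for _ in range(steps):
        y += dt * (x * y - y * y + eps)
    return y

if __name__ == "__main__":
    # The integrated trajectory converges to the smoothed ReLU regardless of y0 > 0.
    for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
        print(f"x = {x:+.1f}  integrated y* = {integrate(x):.4f}  closed form = {smoothed_relu(x):.4f}")
```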

Bibliographic Details
Main Authors: Anderson, David F.; Joshi, Badal; Deshpande, Abhishek
Format: Online Article (Text)
Language: English
Journal: J R Soc Interface
Published: The Royal Society, 14 April 2021
Subjects: Life Sciences–Mathematics interface
Rights: © 2021 The Authors. Published by the Royal Society under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, provided the original author and source are credited.
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8086923/
https://www.ncbi.nlm.nih.gov/pubmed/33849332
http://dx.doi.org/10.1098/rsif.2021.0031