Dynamics and Information Import in Recurrent Neural Networks
Recurrent neural networks (RNNs) are complex dynamical systems, capable of ongoing activity without any driving input. The long-term behavior of free-running RNNs, described by periodic, chaotic and fixed point attractors, is controlled by the statistics of the neural connection weights, such as the density d of non-zero connections, or the balance b between excitatory and inhibitory connections. However, for information processing purposes, RNNs need to receive external input signals, and it is not clear which of the dynamical regimes is optimal for this information import. We use both the average correlations C and the mutual information I between the momentary input vector and the next system state vector as quantitative measures of information import and analyze their dependence on the balance and density of the network. Remarkably, both resulting phase diagrams C(b, d) and I(b, d) are highly consistent, pointing to a link between the dynamical systems and the information-processing approach to complex systems. Information import is maximal not at the “edge of chaos,” which is optimally suited for computation, but surprisingly in the low-density chaotic regime and at the border between the chaotic and fixed point regime. Moreover, we find a completely new type of resonance phenomenon, which we call “Import Resonance” (IR), where the information import shows a maximum, i.e., a peak-like dependence on the coupling strength between the RNN and its external input. IR complements previously found Recurrence Resonance (RR), where correlation and mutual information of successive system states peak for a certain amplitude of noise added to the system. Both IR and RR can be exploited to optimize information processing in artificial neural networks and might also play a crucial role in biological neural systems.
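As a rough illustration of the quantities described in the abstract, the following Python sketch builds a sparse random RNN with connection density d and excitation/inhibition balance b, drives it with an external input at coupling strength w_in, and estimates the average correlation C between the momentary input vector and the next state vector, including a sweep over w_in of the kind that would probe an Import Resonance peak. This is not the authors' code: the tanh update rule, the Gaussian weights and inputs, the network size, the time horizon, and the correlation estimator are illustrative assumptions that the record itself does not specify.

```python
import numpy as np


def random_weights(N, d, b, rng):
    """Sparse random weight matrix with connection density d and a balance
    b in [-1, 1] between excitatory (+) and inhibitory (-) weights; this is
    one simple reading of the parameters named in the abstract."""
    mask = rng.random((N, N)) < d
    signs = np.where(rng.random((N, N)) < (1.0 + b) / 2.0, 1.0, -1.0)
    magnitudes = np.abs(rng.normal(0.0, 1.0 / np.sqrt(d * N), (N, N)))
    return mask * signs * magnitudes


def simulate(W, w_in, T, rng):
    """Drive the RNN with i.i.d. Gaussian input vectors and record, for each
    step, the momentary input u(t) and the resulting next state x(t+1).
    The tanh update rule is an assumption, not taken from the article."""
    N = W.shape[0]
    x = rng.normal(size=N)
    U = np.empty((T, N))
    X_next = np.empty((T, N))
    for t in range(T):
        u = rng.normal(size=N)            # momentary external input vector
        x = np.tanh(W @ x + w_in * u)     # next system state
        U[t] = u
        X_next[t] = x
    return U, X_next


def avg_input_state_correlation(U, X_next):
    """Average absolute Pearson correlation between input component u_i(t)
    and next-state component x_i(t+1), averaged over neurons; a simple
    stand-in for the information-import measure C."""
    N = U.shape[1]
    return float(np.mean([abs(np.corrcoef(U[:, i], X_next[:, i])[0, 1])
                          for i in range(N)]))


rng = np.random.default_rng(0)
W = random_weights(N=100, d=0.2, b=0.0, rng=rng)

# Information import at one fixed input coupling strength.
U, X_next = simulate(W, w_in=1.0, T=5000, rng=rng)
print("C(b=0, d=0.2) ~", round(avg_input_state_correlation(U, X_next), 3))

# Probing for "Import Resonance": sweep the coupling strength w_in and look
# for a peak in C. Whether a peak appears, and where, depends on the
# dynamical regime set by b and d.
for w_in in (0.1, 0.3, 1.0, 3.0, 10.0):
    U, X_next = simulate(W, w_in=w_in, T=2000, rng=rng)
    print(f"w_in = {w_in:5.1f}  ->  C = {avg_input_state_correlation(U, X_next):.3f}")

# The mutual information I(u_i ; x_i(t+1)) could be estimated analogously,
# e.g. with a histogram estimator or sklearn.feature_selection.mutual_info_regression.
```

The numbers produced by this toy setting need not reproduce the phase diagrams or the resonance reported in the article; the sketch only shows where the parameters b, d and the coupling strength enter, and how C (and, analogously, the mutual information I) between input and next state would be measured.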
Main Authors: | Metzner, Claus; Krauss, Patrick |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2022 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9091337/ https://www.ncbi.nlm.nih.gov/pubmed/35573264 http://dx.doi.org/10.3389/fncom.2022.876315 |
_version_ | 1784704897975844864 |
---|---|
author | Metzner, Claus; Krauss, Patrick |
author_facet | Metzner, Claus; Krauss, Patrick |
author_sort | Metzner, Claus |
collection | PubMed |
description | Recurrent neural networks (RNNs) are complex dynamical systems, capable of ongoing activity without any driving input. The long-term behavior of free-running RNNs, described by periodic, chaotic and fixed point attractors, is controlled by the statistics of the neural connection weights, such as the density d of non-zero connections, or the balance b between excitatory and inhibitory connections. However, for information processing purposes, RNNs need to receive external input signals, and it is not clear which of the dynamical regimes is optimal for this information import. We use both the average correlations C and the mutual information I between the momentary input vector and the next system state vector as quantitative measures of information import and analyze their dependence on the balance and density of the network. Remarkably, both resulting phase diagrams C(b, d) and I(b, d) are highly consistent, pointing to a link between the dynamical systems and the information-processing approach to complex systems. Information import is maximal not at the “edge of chaos,” which is optimally suited for computation, but surprisingly in the low-density chaotic regime and at the border between the chaotic and fixed point regime. Moreover, we find a completely new type of resonance phenomenon, which we call “Import Resonance” (IR), where the information import shows a maximum, i.e., a peak-like dependence on the coupling strength between the RNN and its external input. IR complements previously found Recurrence Resonance (RR), where correlation and mutual information of successive system states peak for a certain amplitude of noise added to the system. Both IR and RR can be exploited to optimize information processing in artificial neural networks and might also play a crucial role in biological neural systems. |
format | Online Article Text |
id | pubmed-9091337 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-9091337 2022-05-12 Dynamics and Information Import in Recurrent Neural Networks Metzner, Claus; Krauss, Patrick Front Comput Neurosci Neuroscience Recurrent neural networks (RNNs) are complex dynamical systems, capable of ongoing activity without any driving input. The long-term behavior of free-running RNNs, described by periodic, chaotic and fixed point attractors, is controlled by the statistics of the neural connection weights, such as the density d of non-zero connections, or the balance b between excitatory and inhibitory connections. However, for information processing purposes, RNNs need to receive external input signals, and it is not clear which of the dynamical regimes is optimal for this information import. We use both the average correlations C and the mutual information I between the momentary input vector and the next system state vector as quantitative measures of information import and analyze their dependence on the balance and density of the network. Remarkably, both resulting phase diagrams C(b, d) and I(b, d) are highly consistent, pointing to a link between the dynamical systems and the information-processing approach to complex systems. Information import is maximal not at the “edge of chaos,” which is optimally suited for computation, but surprisingly in the low-density chaotic regime and at the border between the chaotic and fixed point regime. Moreover, we find a completely new type of resonance phenomenon, which we call “Import Resonance” (IR), where the information import shows a maximum, i.e., a peak-like dependence on the coupling strength between the RNN and its external input. IR complements previously found Recurrence Resonance (RR), where correlation and mutual information of successive system states peak for a certain amplitude of noise added to the system. Both IR and RR can be exploited to optimize information processing in artificial neural networks and might also play a crucial role in biological neural systems. Frontiers Media S.A. 2022-04-27 /pmc/articles/PMC9091337/ /pubmed/35573264 http://dx.doi.org/10.3389/fncom.2022.876315 Text en Copyright © 2022 Metzner and Krauss. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Metzner, Claus Krauss, Patrick Dynamics and Information Import in Recurrent Neural Networks |
title | Dynamics and Information Import in Recurrent Neural Networks |
title_full | Dynamics and Information Import in Recurrent Neural Networks |
title_fullStr | Dynamics and Information Import in Recurrent Neural Networks |
title_full_unstemmed | Dynamics and Information Import in Recurrent Neural Networks |
title_short | Dynamics and Information Import in Recurrent Neural Networks |
title_sort | dynamics and information import in recurrent neural networks |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9091337/ https://www.ncbi.nlm.nih.gov/pubmed/35573264 http://dx.doi.org/10.3389/fncom.2022.876315 |
work_keys_str_mv | AT metznerclaus dynamicsandinformationimportinrecurrentneuralnetworks AT krausspatrick dynamicsandinformationimportinrecurrentneuralnetworks |