Dynamical complexity and computation in recurrent neural networks beyond their fixed point
Spontaneous activity found in neural networks usually results in a reduction of computational performance. As a consequence, artificial neural networks are often operated at the edge of chaos, where the network is stable yet highly susceptible to input information. Surprisingly, regular spontaneous...
Main Authors: Marquez, Bicky A.; Larger, Laurent; Jacquot, Maxime; Chembo, Yanne K.; Brunner, Daniel
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2018
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5820323/
https://www.ncbi.nlm.nih.gov/pubmed/29463810
http://dx.doi.org/10.1038/s41598-018-21624-2
Similar Items

- Slow points and adiabatic fixed points in recurrent neural networks
  by: Wernecke, Hendrik, et al.
  Published: (2015)
- Interval Methods for Seeking Fixed Points of Recurrent Neural Networks
  by: Kubica, Bartłomiej Jacek, et al.
  Published: (2020)
- The computation of fixed points and applications
  by: Todd, Michael J.
  Published: (1976)
- Pattern Recognition in Neural Networks with Competing Dynamics: Coexistence of Fixed-Point and Cyclic Attractors
  by: Herrera-Aguilar, José L., et al.
  Published: (2012)
- Computation of fixed points in a circular machine
  by: Verdier, A., et al.
  Published: (1997)