Dynamical complexity and computation in recurrent neural networks beyond their fixed point


Bibliographic Details
Main Authors: Marquez, Bicky A., Larger, Laurent, Jacquot, Maxime, Chembo, Yanne K., Brunner, Daniel
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5820323/
https://www.ncbi.nlm.nih.gov/pubmed/29463810
http://dx.doi.org/10.1038/s41598-018-21624-2
Description
Summary: Spontaneous activity found in neural networks usually results in a reduction of computational performance. As a consequence, artificial neural networks are often operated at the edge of chaos, where the network is stable yet highly susceptible to input information. Surprisingly, regular spontaneous dynamics in neural networks beyond their resting state possess a high degree of spatio-temporal synchronization, a situation that can also be found in biological neural networks. Characterizing information preservation via complexity indices, we show how spatial synchronization allows rRNNs to reduce the negative impact of regular spontaneous dynamics on their computational performance.
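The "edge of chaos" operating point mentioned in the summary can be illustrated with a minimal sketch (not taken from the paper): a random recurrent network whose recurrent weight matrix is rescaled to a chosen spectral radius. Below the critical radius the autonomous network relaxes to its zero fixed point; above it, spontaneous activity persists. All function and parameter names here are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def autonomous_activity(spectral_radius, n=100, steps=500, seed=0):
    """Run a random tanh recurrent network with no input and return the
    root-mean-square of its activity after a transient.
    Hypothetical sketch of the edge-of-chaos idea, not the paper's model."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n)) / np.sqrt(n)
    # Rescale W so its largest eigenvalue magnitude equals spectral_radius.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    x = rng.standard_normal(n) * 0.1          # small random initial state
    for _ in range(steps):
        x = np.tanh(W @ x)                    # autonomous (input-free) update
    return float(np.linalg.norm(x) / np.sqrt(n))

# Below the critical spectral radius (~1 for tanh networks) the activity
# decays to the zero fixed point; above it, spontaneous dynamics survive.
quiet = autonomous_activity(0.8)
active = autonomous_activity(1.5)
```

In reservoir-computing practice, the spectral radius is the standard knob for placing such a network near this transition, where it is stable yet highly susceptible to input, as the summary describes.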