
Input correlations impede suppression of chaos and learning in balanced firing-rate networks

Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than through independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
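The abstract describes driving a random firing-rate network with either common or independent time-varying input and measuring the largest Lyapunov exponent to quantify whether chaos is suppressed. As a rough illustration only, the Python sketch below simulates a generic random rate network, tau dh/dt = -h + J tanh(h) + I(t), under sinusoidal drive and estimates the largest Lyapunov exponent with a standard two-trajectory (Benettin-style) renormalization scheme. The network structure, parameter values, and function names are assumptions for illustration; this is not the authors' code and it omits the balanced excitatory-inhibitory architecture that is central to the paper's result.

```python
# Minimal sketch (illustrative assumptions, not the paper's model or code):
# a driven random firing-rate network, tau * dh/dt = -h + J @ tanh(h) + I(t),
# with the largest Lyapunov exponent estimated by tracking a nearby perturbed
# trajectory and renormalizing its separation at every step (Benettin method).
import numpy as np

def euler_step(h, I, J, dt=0.01, tau=1.0):
    """One Euler step of the rate dynamics."""
    return h + dt / tau * (-h + J @ np.tanh(h) + I)

def largest_lyapunov(N=400, g=2.0, amp=1.0, freq=1.0, common=True,
                     T=100.0, dt=0.01, d0=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))       # random coupling, variance g^2/N
    # Common input: one shared phase; independent input: a random phase per neuron.
    phases = np.zeros(N) if common else rng.uniform(0.0, 2 * np.pi, size=N)
    h = rng.normal(size=N)
    u = rng.normal(size=N)
    u /= np.linalg.norm(u)
    h_pert = h + d0 * u                                     # nearby trajectory at distance d0
    log_growth, steps = 0.0, int(T / dt)
    for n in range(steps):
        I = amp * np.sin(2 * np.pi * freq * n * dt + phases)  # sinusoidal drive
        h = euler_step(h, I, J, dt)
        h_pert = euler_step(h_pert, I, J, dt)
        d = np.linalg.norm(h_pert - h)
        log_growth += np.log(d / d0)
        h_pert = h + (h_pert - h) * (d0 / d)                 # renormalize separation to d0
    return log_growth / (steps * dt)                         # average exponential growth rate

if __name__ == "__main__":
    for common in (True, False):
        lam = largest_lyapunov(common=common)
        print(("common" if common else "independent") + " input:",
              "lambda_max ~", round(lam, 3))
```

A negative estimate indicates the drive has suppressed the chaotic fluctuations; a positive one indicates residual chaos. Because this toy network is not in the balanced state, it only shows how such an exponent can be computed numerically, not the paper's common-versus-independent comparison.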


Bibliographic Details
Main Authors: Engelken, Rainer, Ingrosso, Alessandro, Khajeh, Ramin, Goedeke, Sven, Abbott, L. F.
Format: Online Article Text
Language: English
Published: Public Library of Science 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9754616/
https://www.ncbi.nlm.nih.gov/pubmed/36469504
http://dx.doi.org/10.1371/journal.pcbi.1010590
_version_ 1784851240143814656
author Engelken, Rainer
Ingrosso, Alessandro
Khajeh, Ramin
Goedeke, Sven
Abbott, L. F.
author_facet Engelken, Rainer
Ingrosso, Alessandro
Khajeh, Ramin
Goedeke, Sven
Abbott, L. F.
author_sort Engelken, Rainer
collection PubMed
description Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than through independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
format Online
Article
Text
id pubmed-9754616
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-9754616 2022-12-16 Input correlations impede suppression of chaos and learning in balanced firing-rate networks Engelken, Rainer Ingrosso, Alessandro Khajeh, Ramin Goedeke, Sven Abbott, L. F. PLoS Comput Biol Research Article Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than through independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks. Public Library of Science 2022-12-05 /pmc/articles/PMC9754616/ /pubmed/36469504 http://dx.doi.org/10.1371/journal.pcbi.1010590 Text en © 2022 Engelken et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Engelken, Rainer
Ingrosso, Alessandro
Khajeh, Ramin
Goedeke, Sven
Abbott, L. F.
Input correlations impede suppression of chaos and learning in balanced firing-rate networks
title Input correlations impede suppression of chaos and learning in balanced firing-rate networks
title_full Input correlations impede suppression of chaos and learning in balanced firing-rate networks
title_fullStr Input correlations impede suppression of chaos and learning in balanced firing-rate networks
title_full_unstemmed Input correlations impede suppression of chaos and learning in balanced firing-rate networks
title_short Input correlations impede suppression of chaos and learning in balanced firing-rate networks
title_sort input correlations impede suppression of chaos and learning in balanced firing-rate networks
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9754616/
https://www.ncbi.nlm.nih.gov/pubmed/36469504
http://dx.doi.org/10.1371/journal.pcbi.1010590
work_keys_str_mv AT engelkenrainer inputcorrelationsimpedesuppressionofchaosandlearninginbalancedfiringratenetworks
AT ingrossoalessandro inputcorrelationsimpedesuppressionofchaosandlearninginbalancedfiringratenetworks
AT khajehramin inputcorrelationsimpedesuppressionofchaosandlearninginbalancedfiringratenetworks
AT goedekesven inputcorrelationsimpedesuppressionofchaosandlearninginbalancedfiringratenetworks
AT abbottlf inputcorrelationsimpedesuppressionofchaosandlearninginbalancedfiringratenetworks