Transferring Learning from External to Internal Weights in Echo-State Networks with Sparse Connectivity
Modifying weights within a recurrent network to improve performance on a task has proven to be difficult. Echo-state networks in which modification is restricted to the weights of connections onto network outputs provide an easier alternative, but at the expense of modifying the typically sparse arc...
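The abstract's core idea, training only the output (readout) weights of a fixed, sparse recurrent network, can be sketched as follows. This is a minimal illustrative echo-state network, not the paper's exact setup: reservoir size, sparsity, spectral-radius scaling, the sine task, and the ridge-regression fit are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 500  # reservoir size, number of timesteps (illustrative values)

# Sparse random recurrent weights (about 10% connectivity), rescaled so the
# spectral radius is below 1 -- the usual "echo-state property" condition.
W = rng.normal(0, 1, (N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(0, 1, N)  # fixed random input weights

# Drive the reservoir with a sine input and collect its states.
u = np.sin(np.linspace(0, 8 * np.pi, T))
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Target: a one-step-delayed copy of the input. Only the readout weights
# W_out are modified, fitted here by ridge regression on the states.
y = np.roll(u, 1)
W_out = np.linalg.solve(states.T @ states + 1e-6 * np.eye(N), states.T @ y)

pred = states @ W_out
mse = np.mean((pred[50:] - y[50:]) ** 2)  # skip the initial transient
print(f"readout MSE: {mse:.4f}")
```

The recurrent matrix `W` and input weights `w_in` are never updated; all learning is confined to the linear readout, which is what makes this scheme tractable compared with modifying weights inside the recurrent network.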
Main Authors: Sussillo, David; Abbott, L.F.
Format: Online Article Text
Language: English
Published: Public Library of Science, 2012
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3360031/
https://www.ncbi.nlm.nih.gov/pubmed/22655041
http://dx.doi.org/10.1371/journal.pone.0037372
Similar Items
- Sparse balance: Excitatory-inhibitory networks with small bias currents and broadly distributed synaptic weights
  by: Khajeh, Ramin, et al.
  Published: (2022)
- Sparse adaptive filters for echo cancellation
  by: Paleologu, Constantin, et al.
  Published: (2011)
- Tailoring Echo State Networks for Optimal Learning
  by: Aceituno, Pau Vilimelis, et al.
  Published: (2020)
- Dendritic normalisation improves learning in sparsely connected artificial neural networks
  by: Bird, Alex D., et al.
  Published: (2021)
- Selectivity and Sparseness in Randomly Connected Balanced Networks
  by: Pehlevan, Cengiz, et al.
  Published: (2014)