Fine-Tuning and the Stability of Recurrent Neural Networks
A central criticism of standard theoretical approaches to constructing stable, recurrent model networks is that the synaptic connection weights need to be finely-tuned. This criticism is severe because proposed rules for learning these weights have been shown to have various limitations to their bio...
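The fine-tuning concern can be illustrated with a minimal numerical sketch (not from the paper, and hypothetical in its details): for a linear recurrent network x_{t+1} = W x_t, stability requires the spectral radius of W to be below 1, so weights tuned just inside that boundary are fragile to small errors.

```python
# Hypothetical illustration of the fine-tuning problem: a linear
# recurrent network x_{t+1} = W x_t is stable only if the spectral
# radius of W is below 1. Scaling W to sit just inside that boundary
# leaves no margin for small weight errors.
import numpy as np

rng = np.random.default_rng(0)

def spectral_radius(W):
    """Largest eigenvalue magnitude of W; < 1 means activity decays."""
    return float(np.max(np.abs(np.linalg.eigvals(W))))

n = 50
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
W *= 0.999 / spectral_radius(W)  # "finely tune" to just inside stability

print(spectral_radius(W))         # ~0.999: marginally stable
print(spectral_radius(1.01 * W))  # a 1% uniform weight error pushes it above 1
```

Because eigenvalues scale linearly with a uniform rescaling of W, even a 1% error in the tuned weights crosses the stability boundary, which is the kind of sensitivity the abstract's criticism refers to.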
Main authors: MacNeil, David; Eliasmith, Chris
Format: Online Article Text
Language: English
Published: Public Library of Science, 2011
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3181247/ https://www.ncbi.nlm.nih.gov/pubmed/21980334 http://dx.doi.org/10.1371/journal.pone.0022885
Similar Items
- Optimizing Semantic Pointer Representations for Symbol-Like Processing in Spiking Neural Networks
  by: Gosmann, Jan, et al.
  Published: (2016)
- Using Neural Networks to Generate Inferential Roles for Natural Language
  by: Blouw, Peter, et al.
  Published: (2018)
- Automatic Optimization of the Computation Graph in the Nengo Neural Network Simulator
  by: Gosmann, Jan, et al.
  Published: (2017)
- Spiking neural networks fine-tuning for brain image segmentation
  by: Yue, Ye, et al.
  Published: (2023)
- Fine-tuning of a generative neural network for designing multi-target compounds
  by: Blaschke, Thomas, et al.
  Published: (2021)