SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method
SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller’s scaled conjugate gradient algorithm, the latter a variation...
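The training scheme the abstract describes — simulated annealing to explore weight space globally, followed by conjugate-gradient refinement of the best point found — can be illustrated with a minimal sketch. This is not SAGRAD's Fortran 77 code: the toy data, cooling schedule, and single logistic neuron are illustrative assumptions, and plain gradient descent stands in for Møller's scaled conjugate gradient.

```python
import math
import random

random.seed(0)

# Toy 1-D binary-classification data (illustrative, not from SAGRAD).
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]

def loss(w, b):
    """Mean cross-entropy of a single logistic neuron."""
    total = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard log(0)
        total -= y * math.log(p) + (1 - y) * math.log(1 - p)
    return total / len(data)

def anneal(steps=500, t0=1.0):
    """Simulated annealing over (w, b): always accept downhill moves,
    accept uphill moves with probability exp(-delta/T), cool T geometrically."""
    w, b = random.uniform(-1, 1), random.uniform(-1, 1)
    best, best_loss = (w, b), loss(w, b)
    cur_loss, t = best_loss, t0
    for _ in range(steps):
        nw, nb = w + random.gauss(0, 0.5), b + random.gauss(0, 0.5)
        nl = loss(nw, nb)
        if nl < cur_loss or random.random() < math.exp(-(nl - cur_loss) / t):
            w, b, cur_loss = nw, nb, nl
            if nl < best_loss:
                best, best_loss = (w, b), nl
        t *= 0.99
    return best

def refine(w, b, steps=200, lr=0.5):
    """Gradient refinement (a simple stand-in for the scaled
    conjugate gradient step used in SAGRAD)."""
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x / len(data)
            gb += (p - y) / len(data)
        w -= lr * gw
        b -= lr * gb
    return w, b

w0, b0 = anneal()       # global exploration
w, b = refine(w0, b0)   # local refinement from the annealed start
```

The annealing stage supplies a starting point unlikely to sit in a poor local minimum; the gradient stage then converges quickly from it, which mirrors the division of labor between the two methods in the program described above.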
Main Authors: Bernal, Javier; Torres-Jimenez, Jose
Format: Online Article Text
Language: English
Published: [Gaithersburg, MD] : U.S. Dept. of Commerce, National Institute of Standards and Technology, 2015
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4730672/
https://www.ncbi.nlm.nih.gov/pubmed/26958442
http://dx.doi.org/10.6028/jres.120.009
Similar Items
- Neural Network Structure Optimization by Simulated Annealing
  by: Kuo, Chun Lin, et al.
  Published: (2022)
- Dynamic balance of a bipedal robot using neural network training with simulated annealing
  by: Angeles-García, Yoqsan, et al.
  Published: (2022)
- Multi-objective simulated annealing for hyper-parameter optimization in convolutional neural networks
  by: Gülcü, Ayla, et al.
  Published: (2021)
- Gradient Decomposition Methods for Training Neural Networks With Non-ideal Synaptic Devices
  by: Zhao, Junyun, et al.
  Published: (2021)
- Preconditioned Conjugate Gradient Methods
  by: Axelsson, Owe, et al.
  Published: (1990)