
Accelerating Hyperparameter Optimization of Deep Neural Network via Progressive Multi-Fidelity Evaluation

Deep neural networks usually require careful tuning of hyperparameters to show their best performance. However, with the size of state-of-the-art neural networks growing larger, the evaluation cost of traditional Bayesian optimization has become unacceptable in most cases. Moreover, most practic...


Bibliographic Details
Main Authors: Zhu, Guanghui; Zhu, Ruancheng
Format: Online Article Text
Language: English
Published: 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7206157/
http://dx.doi.org/10.1007/978-3-030-47426-3_58

Similar Items