Accelerating Hyperparameter Optimization of Deep Neural Network via Progressive Multi-Fidelity Evaluation
Deep neural networks usually require careful tuning of hyperparameters to show their best performance. However, with the size of state-of-the-art neural networks growing larger, the evaluation cost of traditional Bayesian optimization has become unacceptable in most cases. Moreover, most practic...
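The abstract is cut off before the method is described, but the title's core idea, evaluating many hyperparameter configurations at low fidelity (e.g., few training epochs) and promoting only promising ones to higher fidelity, can be illustrated with a generic successive-halving-style loop. The sketch below is a minimal illustration of that general multi-fidelity scheme, not the paper's actual algorithm; `sample_config` and `evaluate` are hypothetical placeholders standing in for a real search space and training run.

```python
# A minimal sketch of multi-fidelity hyperparameter evaluation in the style of
# successive halving. Generic illustration only, NOT the paper's method.
import random

def sample_config():
    # Hypothetical: draw one hyperparameter configuration at random.
    return {"lr": 10 ** random.uniform(-4, -1),
            "batch_size": random.choice([32, 64, 128])}

def evaluate(config, budget):
    # Hypothetical stand-in for training a network for `budget` epochs
    # and returning its validation loss (lower is better).
    return random.random() / (budget * config["lr"])

def successive_halving(n_configs=27, min_budget=1, eta=3):
    """Evaluate many configs at a cheap fidelity, then promote the best
    fraction to progressively larger budgets."""
    configs = [sample_config() for _ in range(n_configs)]
    budget = min_budget
    while len(configs) > 1:
        scored = [(evaluate(c, budget), c) for c in configs]
        scored.sort(key=lambda t: t[0])            # lowest loss first
        configs = [c for _, c in scored[: max(1, len(configs) // eta)]]
        budget *= eta                              # raise fidelity for survivors
    return configs[0]

if __name__ == "__main__":
    print("best config:", successive_halving())
```

The point of the progression is that most configurations are discarded after only a cheap low-budget evaluation, so the full training cost is paid only for a handful of survivors.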
Main Authors: Zhu, Guanghui; Zhu, Ruancheng
Format: Online Article (Text)
Language: English
Published: 2020
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7206157/
http://dx.doi.org/10.1007/978-3-030-47426-3_58
Similar Items
- Bayesian Multi-objective Hyperparameter Optimization for Accurate, Fast, and Efficient Neural Network Accelerator Design
  by: Parsa, Maryam, et al.
  Published: (2020)
- Parsimonious Optimization of Multitask Neural Network Hyperparameters
  by: Valsecchi, Cecile, et al.
  Published: (2021)
- Fully Convolutional Deep Neural Networks with Optimized Hyperparameters for Detection of Shockable and Non-Shockable Rhythms
  by: Krasteva, Vessela, et al.
  Published: (2020)
- Optimization of convolutional neural network hyperparameters for automatic classification of adult mosquitoes
  by: Motta, Daniel, et al.
  Published: (2020)
- Neural network hyperparameter optimization for prediction of real estate prices in Helsinki
  by: Kalliola, Jussi, et al.
  Published: (2021)