
HyAdamC: A New Adam-Based Hybrid Optimization Algorithm for Convolution Neural Networks

Bibliographic Details
Main Authors: Kim, Kyung-Soo; Choi, Yong-Suk
Format: Online Article Text
Language: English
Published: MDPI, 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8231656/
https://www.ncbi.nlm.nih.gov/pubmed/34204695
http://dx.doi.org/10.3390/s21124054
_version_ 1783713475797712896
author Kim, Kyung-Soo
Choi, Yong-Suk
author_facet Kim, Kyung-Soo
Choi, Yong-Suk
author_sort Kim, Kyung-Soo
collection PubMed
Abstract: As the performance of devices that conduct large-scale computations has rapidly improved, various deep learning models have been successfully applied across many domains. In particular, convolutional neural networks (CNNs) have shown remarkable performance in image processing tasks such as image classification and segmentation. Accordingly, more stable and robust optimization methods are required to train them effectively. However, the traditional optimizers used in deep learning still show unsatisfactory training performance on models with many layers and weights. In this paper, we therefore propose a new Adam-based hybrid optimization method called HyAdamC for training CNNs effectively. HyAdamC uses three new velocity control functions to carefully adjust its search strength in terms of initial, short-term, and long-term velocities. Moreover, HyAdamC utilizes an adaptive coefficient computation method to prevent the search direction determined by the first momentum from being distorted by outlier gradients. These components are then combined into one hybrid method. In our experiments, HyAdamC showed not only notable test accuracies but also significantly stable and robust optimization behavior when training various CNN models. Furthermore, we found that HyAdamC can be applied not only to image classification but also to image segmentation tasks.
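The abstract describes HyAdamC only at a high level; the three velocity-control functions and the adaptive-coefficient rule are defined in the full paper. As a rough, hypothetical illustration of the general idea (an Adam-style update whose first-moment coefficient is adapted so that outlier gradients do not distort the search direction), one might sketch something like the following. The function name, the outlier heuristic, and the damped coefficient value here are assumptions for illustration, not the authors' formulas:

```python
import numpy as np

def hyadamc_like_step(w, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam-style update with an adaptively damped first moment.

    Illustrative sketch only: the real HyAdamC combines three velocity-control
    functions and a specific adaptive-coefficient computation described in the
    paper; the simple outlier test below is a stand-in.
    """
    m, v, t = state["m"], state["v"], state["t"] + 1
    # Stand-in outlier check: if the incoming gradient is far larger than the
    # running second-moment scale, trust the accumulated history more.
    scale = np.sqrt(v.mean()) + eps
    outlier = t > 1 and np.abs(grad).mean() > 3.0 * scale
    b1 = 0.99 if outlier else beta1
    m = b1 * m + (1.0 - b1) * grad          # first moment (search direction)
    v = beta2 * v + (1.0 - beta2) * grad**2  # second moment (step scaling)
    m_hat = m / (1.0 - beta1**t)             # standard Adam bias correction
    v_hat = v / (1.0 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, {"m": m, "v": v, "t": t}
```

On a simple quadratic loss such as f(w) = w^2 (gradient 2w), repeatedly calling this step drives w toward zero, just as plain Adam would; the damping only kicks in when a gradient is anomalously large relative to its recent history.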
Journal: Sensors (Basel). Published by MDPI, 2021-06-12. © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).