
An Improved Adam Optimization Algorithm Combining Adaptive Coefficients and Composite Gradients Based on Randomized Block Coordinate Descent

Bibliographic Details
Main Authors: Liu, Miaomiao, Yao, Dan, Liu, Zhigang, Guo, Jingfeng, Chen, Jing
Format: Online Article Text
Language: English
Published: Hindawi 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9845049/
https://www.ncbi.nlm.nih.gov/pubmed/36660559
http://dx.doi.org/10.1155/2023/4765891
author Liu, Miaomiao
Yao, Dan
Liu, Zhigang
Guo, Jingfeng
Chen, Jing
collection PubMed
description An improved Adam optimization algorithm combining adaptive coefficients and composite gradients based on randomized block coordinate descent is proposed to address issues of the Adam algorithm such as slow convergence, the tendency to miss the global optimal solution, and ineffectiveness in processing high-dimensional vectors. First, an adaptive coefficient is used to adjust the gradient deviation and correct the search direction. Then, a predicted gradient is introduced and combined with the current gradient and the first-order momentum to form a composite gradient, improving the global optimization ability. Finally, the randomized block coordinate method is used to determine the gradient update mode, which reduces the computational overhead. Simulation experiments on two standard classification datasets show that the convergence speed and accuracy of the proposed algorithm exceed those of six comparison gradient descent methods, while CPU and memory utilization are significantly reduced. In addition, BP neural networks optimized by the six algorithms, respectively, are used to predict reservoir porosity from logging data. Results show that the proposed method has lower system overhead, higher accuracy, and stronger stability, and the absolute error is within 0.1% for more than 86% of the data, which further verifies its effectiveness.
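The abstract outlines three mechanisms: an adaptive coefficient that corrects the search direction, a composite gradient blending a predicted gradient with the current gradient and first-order momentum, and randomized block coordinate updates. The paper's exact formulas are not reproduced in this record, so the following Python sketch is only one plausible reading: the function name improved_adam_step, the deviation-based form of the coefficient alpha, the momentum-lookahead predicted gradient, and the equal blending weights are all illustrative assumptions, not the authors' definitions.

    import numpy as np

    def improved_adam_step(theta, grad_fn, m, v, t, blocks,
                           lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One sketched update. theta: parameter vector; grad_fn: gradient
        oracle; m, v: first/second moment estimates; t: 1-based step count;
        blocks: list of index arrays partitioning the coordinates."""
        g = grad_fn(theta)

        # (1) Adaptive coefficient (assumed form): shrink alpha when the raw
        # gradient deviates strongly from the running first moment, damping
        # the deviation and correcting the search direction.
        deviation = np.linalg.norm(g - m) / (np.linalg.norm(g) + eps)
        alpha = 1.0 / (1.0 + deviation)

        # (2) Composite gradient (assumed form): a lookahead "predicted"
        # gradient taken along the momentum direction, blended with the
        # current gradient and the first-order momentum.
        g_pred = grad_fn(theta - lr * m)
        g_comp = alpha * g + (1.0 - alpha) * 0.5 * (g_pred + m)

        # Standard Adam moment estimates with bias correction, driven by
        # the composite gradient instead of the raw one.
        m = beta1 * m + (1.0 - beta1) * g_comp
        v = beta2 * v + (1.0 - beta2) * g_comp ** 2
        m_hat = m / (1.0 - beta1 ** t)
        v_hat = v / (1.0 - beta2 ** t)

        # (3) Randomized block coordinate descent: update only one randomly
        # chosen block of coordinates per step to cut per-step cost.
        idx = blocks[np.random.randint(len(blocks))]
        theta = theta.copy()
        theta[idx] -= lr * m_hat[idx] / (np.sqrt(v_hat[idx]) + eps)
        return theta, m, v

A caller might partition the coordinates once, e.g. blocks = np.array_split(np.arange(theta.size), 8), then invoke the step in a loop with t starting at 1. Updating a single random block per step is what the abstract credits with the reduced computational overhead on high-dimensional vectors.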
format Online
Article
Text
id pubmed-9845049
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-9845049 2023-01-18 Comput Intell Neurosci, Research Article. Hindawi 2023-01-10 /pmc/articles/PMC9845049/ /pubmed/36660559 http://dx.doi.org/10.1155/2023/4765891 Text en Copyright © 2023 Miaomiao Liu et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
title An Improved Adam Optimization Algorithm Combining Adaptive Coefficients and Composite Gradients Based on Randomized Block Coordinate Descent
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9845049/
https://www.ncbi.nlm.nih.gov/pubmed/36660559
http://dx.doi.org/10.1155/2023/4765891