Universal activation function for machine learning
This article proposes a universal activation function (UAF) that achieves near-optimal performance in quantification, classification, and reinforcement learning (RL) problems. For any given problem, gradient descent algorithms are able to evolve the UAF into a suitable activation function by tuning the UAF's parameters.
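The formulas in this record are elided as "[Formula: see text]", so the article's exact UAF expression is not reproduced here. The snippet below is only a minimal PyTorch sketch of the idea the abstract describes: an activation with a handful of trainable parameters (named A–E here as an assumption) that gradient descent can tune so the nonlinearity morphs toward identity-, sigmoid-, or Mish-like shapes. It is not the article's published formula.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TrainableActivation(nn.Module):
    """Sketch of a parameterized activation in the spirit of a UAF.

    f(x) = softplus(A*(x + B) + C*x**2) - softplus(D*(x - B)) + E

    The functional form and the parameter names A..E are assumptions
    made for illustration, not the article's published definition.
    """

    def __init__(self):
        super().__init__()
        # At A=1, B=0, C=0, D=-1, E=0 the sketch reduces to the
        # identity function, one of the shapes the abstract mentions.
        self.A = nn.Parameter(torch.tensor(1.0))
        self.B = nn.Parameter(torch.tensor(0.0))
        self.C = nn.Parameter(torch.tensor(0.0))
        self.D = nn.Parameter(torch.tensor(-1.0))
        self.E = nn.Parameter(torch.tensor(0.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # softplus(z) = ln(1 + exp(z)); A..E are ordinary nn.Parameters,
        # so any optimizer updates them alongside the network weights.
        return (F.softplus(self.A * (x + self.B) + self.C * x**2)
                - F.softplus(self.D * (x - self.B))
                + self.E)


# Usage: drop the module into a network and train as usual.
act = TrainableActivation()
x = torch.linspace(-3.0, 3.0, steps=7)
print(act(x))  # close to x itself at this initialization
```

Because every shape change is just a change in the five parameters, the same module can settle into a different activation for each task, which is the behavior the abstract reports across CIFAR-10, CORA, ZINC, and BipedalWalker-v2.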
Main Authors: | Yuen, Brosnan; Hoang, Minh Tu; Dong, Xiaodai; Lu, Tao |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8455573/ https://www.ncbi.nlm.nih.gov/pubmed/34548504 http://dx.doi.org/10.1038/s41598-021-96723-8 |
_version_ | 1784570696119091200 |
---|---|
author | Yuen, Brosnan; Hoang, Minh Tu; Dong, Xiaodai; Lu, Tao |
author_facet | Yuen, Brosnan; Hoang, Minh Tu; Dong, Xiaodai; Lu, Tao |
author_sort | Yuen, Brosnan |
collection | PubMed |
description | This article proposes a universal activation function (UAF) that achieves near-optimal performance in quantification, classification, and reinforcement learning (RL) problems. For any given problem, gradient descent algorithms are able to evolve the UAF into a suitable activation function by tuning the UAF's parameters. For CIFAR-10 classification using the VGG-8 neural network, the UAF converges to a Mish-like activation function, which has near-optimal performance [Formula: see text] when compared to other activation functions. In the graph convolutional neural network on the CORA dataset, the UAF evolves to the identity function and obtains [Formula: see text]. For the quantification of simulated 9-gas mixtures in 30 dB signal-to-noise ratio (SNR) environments, the UAF converges to the identity function, which has a near-optimal root mean square error (RMSE) of [Formula: see text]. In the ZINC molecular solubility quantification using graph neural networks, the UAF morphs into a LeakyReLU/Sigmoid hybrid and achieves an RMSE of [Formula: see text]. For the BipedalWalker-v2 RL environment, the UAF reaches the 250 reward threshold in [Formula: see text] epochs with a new activation function shape, giving the fastest convergence rate among the activation functions tested. |
format | Online Article Text |
id | pubmed-8455573 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-8455573 2021-09-22 Universal activation function for machine learning Yuen, Brosnan; Hoang, Minh Tu; Dong, Xiaodai; Lu, Tao Sci Rep Article This article proposes a universal activation function (UAF) that achieves near-optimal performance in quantification, classification, and reinforcement learning (RL) problems. For any given problem, gradient descent algorithms are able to evolve the UAF into a suitable activation function by tuning the UAF's parameters. For CIFAR-10 classification using the VGG-8 neural network, the UAF converges to a Mish-like activation function, which has near-optimal performance [Formula: see text] when compared to other activation functions. In the graph convolutional neural network on the CORA dataset, the UAF evolves to the identity function and obtains [Formula: see text]. For the quantification of simulated 9-gas mixtures in 30 dB signal-to-noise ratio (SNR) environments, the UAF converges to the identity function, which has a near-optimal root mean square error (RMSE) of [Formula: see text]. In the ZINC molecular solubility quantification using graph neural networks, the UAF morphs into a LeakyReLU/Sigmoid hybrid and achieves an RMSE of [Formula: see text]. For the BipedalWalker-v2 RL environment, the UAF reaches the 250 reward threshold in [Formula: see text] epochs with a new activation function shape, giving the fastest convergence rate among the activation functions tested. Nature Publishing Group UK 2021-09-21 /pmc/articles/PMC8455573/ /pubmed/34548504 http://dx.doi.org/10.1038/s41598-021-96723-8 Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Yuen, Brosnan Hoang, Minh Tu Dong, Xiaodai Lu, Tao Universal activation function for machine learning |
title | Universal activation function for machine learning |
title_full | Universal activation function for machine learning |
title_fullStr | Universal activation function for machine learning |
title_full_unstemmed | Universal activation function for machine learning |
title_short | Universal activation function for machine learning |
title_sort | universal activation function for machine learning |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8455573/ https://www.ncbi.nlm.nih.gov/pubmed/34548504 http://dx.doi.org/10.1038/s41598-021-96723-8 |
work_keys_str_mv | AT yuenbrosnan universalactivationfunctionformachinelearning AT hoangminhtu universalactivationfunctionformachinelearning AT dongxiaodai universalactivationfunctionformachinelearning AT lutao universalactivationfunctionformachinelearning |