Smooth Function Approximation by Deep Neural Networks with General Activation Functions

Bibliographic Details
Main Authors: Ohn, Ilsang; Kim, Yongdai
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515121/
https://www.ncbi.nlm.nih.gov/pubmed/33267341
http://dx.doi.org/10.3390/e21070627
_version_ 1783586745954074624
author Ohn, Ilsang
Kim, Yongdai
author_facet Ohn, Ilsang
Kim, Yongdai
author_sort Ohn, Ilsang
collection PubMed
description There has been a growing interest in the expressivity of deep neural networks. However, most of the existing work on this topic focuses only on specific activation functions such as the ReLU or sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class of activation functions. This class includes most of the frequently used activation functions. We derive the required depth, width and sparsity of a deep neural network to approximate any Hölder smooth function up to a given approximation error for this large class of activation functions. Based on our approximation error analysis, we derive the minimax optimality of deep neural network estimators with general activation functions in both regression and classification problems.
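For reference, the Hölder smoothness class named in the abstract is conventionally defined as below. This is a standard textbook definition; the paper's exact notation and normalization may differ.

% Hölder class of smoothness \beta > 0 and radius R > 0 on a domain \mathcal{X} \subset \mathbb{R}^d.
% Decompose \beta = m + s with m a nonnegative integer and s \in (0, 1].
\mathcal{H}^{\beta}(\mathcal{X}, R) = \Bigl\{ f : \mathcal{X} \to \mathbb{R} \;:\;
    \sum_{|\alpha| \le m} \sup_{x \in \mathcal{X}} \lvert \partial^{\alpha} f(x) \rvert
    + \sum_{|\alpha| = m} \sup_{x \ne y}
        \frac{\lvert \partial^{\alpha} f(x) - \partial^{\alpha} f(y) \rvert}{\lVert x - y \rVert^{s}}
    \le R \Bigr\}

Roughly, larger \beta means a smoother target function and an easier approximation problem; the depth, width and sparsity bounds in the paper are stated in terms of the smoothness \beta, the input dimension, and the target approximation error.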
format Online
Article
Text
id pubmed-7515121
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7515121 2020-11-09 Smooth Function Approximation by Deep Neural Networks with General Activation Functions Ohn, Ilsang Kim, Yongdai Entropy (Basel) Article There has been a growing interest in the expressivity of deep neural networks. However, most of the existing work on this topic focuses only on specific activation functions such as the ReLU or sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class of activation functions. This class includes most of the frequently used activation functions. We derive the required depth, width and sparsity of a deep neural network to approximate any Hölder smooth function up to a given approximation error for this large class of activation functions. Based on our approximation error analysis, we derive the minimax optimality of deep neural network estimators with general activation functions in both regression and classification problems. MDPI 2019-06-26 /pmc/articles/PMC7515121/ /pubmed/33267341 http://dx.doi.org/10.3390/e21070627 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Ohn, Ilsang
Kim, Yongdai
Smooth Function Approximation by Deep Neural Networks with General Activation Functions
title Smooth Function Approximation by Deep Neural Networks with General Activation Functions
title_full Smooth Function Approximation by Deep Neural Networks with General Activation Functions
title_fullStr Smooth Function Approximation by Deep Neural Networks with General Activation Functions
title_full_unstemmed Smooth Function Approximation by Deep Neural Networks with General Activation Functions
title_short Smooth Function Approximation by Deep Neural Networks with General Activation Functions
title_sort smooth function approximation by deep neural networks with general activation functions
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515121/
https://www.ncbi.nlm.nih.gov/pubmed/33267341
http://dx.doi.org/10.3390/e21070627
work_keys_str_mv AT ohnilsang smoothfunctionapproximationbydeepneuralnetworkswithgeneralactivationfunctions
AT kimyongdai smoothfunctionapproximationbydeepneuralnetworkswithgeneralactivationfunctions