Smooth Function Approximation by Deep Neural Networks with General Activation Functions
There has been growing interest in the expressivity of deep neural networks. However, most existing work on this topic focuses only on specific activation functions such as ReLU or sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class...
Main Authors: Ohn, Ilsang; Kim, Yongdai
Format: Online Article Text
Language: English
Published: MDPI, 2019
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515121/ ; https://www.ncbi.nlm.nih.gov/pubmed/33267341 ; http://dx.doi.org/10.3390/e21070627
Similar Items
- Smoothing and approximation of functions
  by: Shapiro, Harold S.
  Published: (1969)
- Analytic Function Approximation by Path-Norm-Regularized Deep Neural Networks
  by: Beknazaryan, Aleksandr
  Published: (2022)
- Fast Approximations of Activation Functions in Deep Neural Networks when using Posit Arithmetic
  by: Cococcioni, Marco, et al.
  Published: (2020)
- Approximation of Bivariate Functions via Smooth Extensions
  by: Zhang, Zhihua
  Published: (2014)
- Parallel Frequency Function-Deep Neural Network for Efficient Approximation of Complex Broadband Signals
  by: Zeng, Zhi, et al.
  Published: (2022)