Studying the Evolution of Neural Activation Patterns During Training of Feed-Forward ReLU Networks
The ability of deep neural networks to form powerful emergent representations of complex statistical patterns in data is as remarkable as it is imperfectly understood. For deep ReLU networks, these are encoded in the mixed discrete–continuous structure of linear weight matrices and non-linear binary activ...
Main Authors: Hartmann, David; Franzen, Daniel; Brodehl, Sebastian
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8733739/ https://www.ncbi.nlm.nih.gov/pubmed/35005614 http://dx.doi.org/10.3389/frai.2021.642374
Similar Items
- Shallow Univariate ReLU Networks as Splines: Initialization, Loss Surface, Hessian, and Gradient Flow Dynamics
  by: Sahs, Justin, et al.
  Published: (2022)
- Integrating geometries of ReLU feedforward neural networks
  by: Liu, Yajing, et al.
  Published: (2023)
- Training a Two-Layer ReLU Network Analytically
  by: Barbu, Adrian
  Published: (2023)
- Improved Geometric Path Enumeration for Verifying ReLU Neural Networks
  by: Bak, Stanley, et al.
  Published: (2020)
- Multimodal transistors as ReLU activation functions in physical neural network classifiers
  by: Surekcigil Pesch, Isin, et al.
  Published: (2022)