Learning in Feedforward Neural Networks Accelerated by Transfer Entropy
Current neural network architectures are often hard to train because of the increasing size and complexity of the datasets used. Our objective is to design more efficient training algorithms utilizing causal relationships inferred from neural networks. The transfer entropy (TE) was initially...
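The abstract names transfer entropy (TE) as the causality measure the training algorithm relies on. As a minimal illustration of the quantity itself (not of the paper's training method), the sketch below is a plug-in TE estimate for discrete time series with history length 1, TE(Y→X) = Σ p(x₁, x₀, y₀) · log₂[ p(x₁ | x₀, y₀) / p(x₁ | x₀) ]; the function name and the toy series are illustrative assumptions, not from the article.

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of TE(Y -> X) for discrete sequences, history length 1:
    TE = sum over (x1, x0, y0) of p(x1, x0, y0) * log2(p(x1|x0, y0) / p(x1|x0))."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))  # joint counts of (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))        # counts of (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))         # counts of (x_{t+1}, x_t)
    singles = Counter(x[:-1])                      # counts of x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_x1_given_x0y0 = c / pairs_xy[(x0, y0)]
        p_x1_given_x0 = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * log2(p_x1_given_x0y0 / p_x1_given_x0)
    return te

# Toy example: y drives x with a one-step lag (x_{t+1} = y_t),
# so knowing y_t reduces uncertainty about x_{t+1} and TE(Y -> X) is positive.
y = [0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0] * 20
x = [0] + y[:-1]
print(transfer_entropy(x, y))  # positive: y carries information about x's future
```

With a longer history or continuous activations (as in a real network), TE estimation requires binning or kernel/k-NN estimators; this toy version only conveys the directed, conditional nature of the measure.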
Main Authors: Moldovan, Adrian; Caţaron, Angel; Andonie, Răzvan
Format: Online Article Text
Language: English
Published: MDPI, 2020
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516405/
https://www.ncbi.nlm.nih.gov/pubmed/33285877
http://dx.doi.org/10.3390/e22010102
Similar Items
- Learning in Convolutional Neural Networks Accelerated by Transfer Entropy
  by: Moldovan, Adrian, et al.
  Published: (2021)
- Transfer Information Energy: A Quantitative Indicator of Information Transfer between Time Series
  by: Caţaron, Angel, et al.
  Published: (2018)
- Synaptic convergence regulates synchronization-dependent spike transfer in feedforward neural networks
  by: Sailamul, Pachaya, et al.
  Published: (2017)
- Semiotic Aggregation in Deep Learning
  by: Muşat, Bogdan, et al.
  Published: (2020)
- Integrating geometries of ReLU feedforward neural networks
  by: Liu, Yajing, et al.
  Published: (2023)