Learning on Arbitrary Graph Topologies via Predictive Coding
Training with backpropagation (BP) in standard deep learning consists of two main steps: a forward pass that maps a data point to its prediction, and a backward pass that propagates the error of this prediction back through the network. This process is highly effective when the goal is to minimize a...
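As a minimal sketch of the two backpropagation steps described in the abstract (not the paper's predictive-coding method), the snippet below runs a forward pass that maps a toy data batch to predictions and a backward pass that propagates the prediction error back through a two-layer network. All names (W1, W2, lr, the toy data) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))        # toy batch: 4 data points, 3 features
y = rng.normal(size=(4, 1))        # toy regression targets

W1 = rng.normal(scale=0.1, size=(3, 5))
W2 = rng.normal(scale=0.1, size=(5, 1))
lr = 0.1

for step in range(100):
    # Forward pass: map each data point to its prediction.
    h = np.tanh(x @ W1)
    y_hat = h @ W2

    # Backward pass: propagate the prediction error back through the network.
    err = y_hat - y                        # gradient of MSE w.r.t. y_hat (up to a constant)
    grad_W2 = h.T @ err
    grad_h = err @ W2.T
    grad_W1 = x.T @ (grad_h * (1 - h**2))  # tanh'(z) = 1 - tanh(z)^2

    W1 -= lr * grad_W1 / len(x)
    W2 -= lr * grad_W2 / len(x)
```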
Main Authors: Salvatori, Tommaso; Pinchetti, Luca; Millidge, Beren; Song, Yuhang; Bao, Tianyi; Bogacz, Rafal; Lukasiewicz, Thomas
Format: Online Article Text
Language: English
Published: 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7614467/ https://www.ncbi.nlm.nih.gov/pubmed/37090087
Similar Items
- Recurrent predictive coding models for associative memory employing covariance learning
  by: Tang, Mufeng, et al.
  Published: (2023)
- Hybrid predictive coding: Inferring, fast and slow
  by: Tscshantz, Alexander, et al.
  Published: (2023)
- Synthesis of Arbitrary Quantum Circuits to Topological Assembly
  by: Paler, Alexandru, et al.
  Published: (2016)
- Martingales and the fixation time of evolutionary graphs with arbitrary dimensionality
  by: Monk, Travis, et al.
  Published: (2022)
- How particular is the physics of the free energy principle?
  by: Aguilera, Miguel, et al.
  Published: (2022)