
Continual Sequence Modeling With Predictive Coding

Recurrent neural networks (RNNs) have proven very successful at modeling sequential data such as language or motion. However, these successes rely on the backpropagation through time (BPTT) algorithm, batch training, and the assumption that all the training data are available at the...

Full description

Bibliographic Details
Main Authors: Annabi, Louis, Pitti, Alexandre, Quoy, Mathias
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9171436/
https://www.ncbi.nlm.nih.gov/pubmed/35686118
http://dx.doi.org/10.3389/fnbot.2022.845955
_version_ 1784721665501954048
author Annabi, Louis
Pitti, Alexandre
Quoy, Mathias
author_facet Annabi, Louis
Pitti, Alexandre
Quoy, Mathias
author_sort Annabi, Louis
collection PubMed
description Recurrent neural networks (RNNs) have proven very successful at modeling sequential data such as language or motion. However, these successes rely on the backpropagation through time (BPTT) algorithm, batch training, and the assumption that all the training data are available at the same time. In contrast, the field of developmental robotics aims at uncovering lifelong learning mechanisms that could allow embodied machines to learn and stabilize knowledge in continuously evolving environments. In this article, we investigate different RNN designs and learning methods, which we evaluate in a continual learning setting. The generative modeling task consists in learning to generate 20 continuous trajectories that are presented sequentially to the learning algorithms. Each method is evaluated according to the average prediction error over the 20 trajectories obtained after complete training. This study focuses on learning algorithms with low memory requirements that do not need to store past information to update their parameters. Our experiments identify two approaches especially well suited to this task: conceptors and predictive coding. We propose combining these two mechanisms into a new model, which we label PC-Conceptors, that outperforms the other methods presented in this study.
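The description mentions conceptors, one of the two mechanisms the article combines. As a hedged illustration (not the authors' implementation), a conceptor for a reservoir can be computed from the state correlation matrix R as C = R(R + α⁻²I)⁻¹, where the aperture α controls how tightly C fits the observed state cloud; a minimal NumPy sketch, with toy dimensions chosen for illustration:

```python
import numpy as np

def conceptor(states, aperture):
    """Compute a conceptor matrix from reservoir states.

    states: (T, N) array of reservoir activations collected while the
            network is driven by one pattern.
    aperture: scalar; larger values let C preserve more state directions.
    """
    T, N = states.shape
    R = states.T @ states / T                          # state correlation matrix
    return R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))

# Toy usage: random states stand in for a driven reservoir.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
C = conceptor(X, aperture=10.0)

# Eigenvalues of C lie in [0, 1): high-variance state directions are
# preserved (eigenvalue near 1), low-variance directions suppressed.
eigvals = np.linalg.eigvalsh(C)
assert np.all(eigvals >= 0) and np.all(eigvals < 1)
```

Storing one such C per learned trajectory is what makes the approach attractive for continual learning: the conceptor summarizes a pattern's state subspace without retaining the past training data themselves.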
format Online
Article
Text
id pubmed-9171436
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9171436 2022-06-08 Continual Sequence Modeling With Predictive Coding Annabi, Louis Pitti, Alexandre Quoy, Mathias Front Neurorobot Neuroscience Recurrent neural networks (RNNs) have proven very successful at modeling sequential data such as language or motion. However, these successes rely on the backpropagation through time (BPTT) algorithm, batch training, and the assumption that all the training data are available at the same time. In contrast, the field of developmental robotics aims at uncovering lifelong learning mechanisms that could allow embodied machines to learn and stabilize knowledge in continuously evolving environments. In this article, we investigate different RNN designs and learning methods, which we evaluate in a continual learning setting. The generative modeling task consists in learning to generate 20 continuous trajectories that are presented sequentially to the learning algorithms. Each method is evaluated according to the average prediction error over the 20 trajectories obtained after complete training. This study focuses on learning algorithms with low memory requirements that do not need to store past information to update their parameters. Our experiments identify two approaches especially well suited to this task: conceptors and predictive coding. We propose combining these two mechanisms into a new model, which we label PC-Conceptors, that outperforms the other methods presented in this study. Frontiers Media S.A. 2022-05-23 /pmc/articles/PMC9171436/ /pubmed/35686118 http://dx.doi.org/10.3389/fnbot.2022.845955 Text en Copyright © 2022 Annabi, Pitti and Quoy. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY).
The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Annabi, Louis
Pitti, Alexandre
Quoy, Mathias
Continual Sequence Modeling With Predictive Coding
title Continual Sequence Modeling With Predictive Coding
title_full Continual Sequence Modeling With Predictive Coding
title_fullStr Continual Sequence Modeling With Predictive Coding
title_full_unstemmed Continual Sequence Modeling With Predictive Coding
title_short Continual Sequence Modeling With Predictive Coding
title_sort continual sequence modeling with predictive coding
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9171436/
https://www.ncbi.nlm.nih.gov/pubmed/35686118
http://dx.doi.org/10.3389/fnbot.2022.845955
work_keys_str_mv AT annabilouis continualsequencemodelingwithpredictivecoding
AT pittialexandre continualsequencemodelingwithpredictivecoding
AT quoymathias continualsequencemodelingwithpredictivecoding