Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks
Main Authors: | Kim, Jonghong; Lee, WonHee; Baek, Sungdae; Hong, Jeong-Ho; Lee, Minho |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10575012/ https://www.ncbi.nlm.nih.gov/pubmed/37836945 http://dx.doi.org/10.3390/s23198117 |
_version_ | 1785120823968792576 |
---|---|
author | Kim, Jonghong Lee, WonHee Baek, Sungdae Hong, Jeong-Ho Lee, Minho |
author_facet | Kim, Jonghong Lee, WonHee Baek, Sungdae Hong, Jeong-Ho Lee, Minho |
author_sort | Kim, Jonghong |
collection | PubMed |
description | Catastrophic forgetting, the rapid loss of learned representations while learning new data or samples, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that addresses the forgetting problem by learning new incoming data in an online manner. The framework can learn extra data or new classes with less catastrophic forgetting. We adapt the hippocampal memory process to deep neural networks by defining the effective maximum of neural activation and its boundary to represent a feature distribution. In addition, we incorporate incremental QR factorization into the deep neural networks so that they learn data with both existing and new labels with less forgetting. The QR factorization provides an accurate subspace prior, and its incremental form reasonably expresses how new data interact with both existing classes and new classes. In our framework, a set of appropriate features (i.e., nodes) provides an improved representation for each class. We apply our method to a convolutional neural network (CNN) on the CIFAR-100 and CIFAR-10 datasets. The experimental results show that the proposed method efficiently alleviates the stability-plasticity dilemma in deep neural networks, preserving the performance of a trained network while effectively learning unseen data and additional new classes. |
format | Online Article Text |
id | pubmed-10575012 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-105750122023-10-14 Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks Kim, Jonghong Lee, WonHee Baek, Sungdae Hong, Jeong-Ho Lee, Minho Sensors (Basel) Article Catastrophic forgetting, the rapid loss of learned representations while learning new data or samples, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that addresses the forgetting problem by learning new incoming data in an online manner. The framework can learn extra data or new classes with less catastrophic forgetting. We adapt the hippocampal memory process to deep neural networks by defining the effective maximum of neural activation and its boundary to represent a feature distribution. In addition, we incorporate incremental QR factorization into the deep neural networks so that they learn data with both existing and new labels with less forgetting. The QR factorization provides an accurate subspace prior, and its incremental form reasonably expresses how new data interact with both existing classes and new classes. In our framework, a set of appropriate features (i.e., nodes) provides an improved representation for each class. We apply our method to a convolutional neural network (CNN) on the CIFAR-100 and CIFAR-10 datasets. The experimental results show that the proposed method efficiently alleviates the stability-plasticity dilemma in deep neural networks, preserving the performance of a trained network while effectively learning unseen data and additional new classes. MDPI 2023-09-27 /pmc/articles/PMC10575012/ /pubmed/37836945 http://dx.doi.org/10.3390/s23198117 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. 
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Kim, Jonghong Lee, WonHee Baek, Sungdae Hong, Jeong-Ho Lee, Minho Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_full | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_fullStr | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_full_unstemmed | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_short | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_sort | incremental learning for online data using qr factorization on convolutional neural networks |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10575012/ https://www.ncbi.nlm.nih.gov/pubmed/37836945 http://dx.doi.org/10.3390/s23198117 |
work_keys_str_mv | AT kimjonghong incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT leewonhee incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT baeksungdae incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT hongjeongho incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT leeminho incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks |
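The abstract's central technique is incremental QR factorization: updating an existing factorization as new feature directions arrive, rather than refactoring from scratch. The paper's exact update rule is not given in this record; the sketch below is only a generic illustration of a column-appending QR update via Gram-Schmidt (the function name and the re-orthogonalization pass are our choices, not the authors'):

```python
import numpy as np

def qr_append_column(Q, R, a):
    """Update a thin QR factorization A = Q R when a new column `a` is appended.

    Q: (m, k) orthonormal columns; R: (k, k) upper triangular; a: (m,) new column.
    Returns (Q_new, R_new) with Q_new (m, k+1), R_new (k+1, k+1).
    """
    r = Q.T @ a            # coefficients of `a` in the current subspace
    q = a - Q @ r          # residual orthogonal to span(Q)
    # One re-orthogonalization pass for numerical stability
    # (classical Gram-Schmidt applied twice).
    dr = Q.T @ q
    q = q - Q @ dr
    r = r + dr
    rho = np.linalg.norm(q)          # length of the new orthogonal direction
    Q_new = np.column_stack([Q, q / rho])
    R_new = np.block([
        [R, r[:, None]],
        [np.zeros((1, R.shape[1])), np.array([[rho]])],
    ])
    return Q_new, R_new
```

Appending a column this way costs O(mk) instead of the O(mk^2) of a full refactorization, which is what makes the update attractive in an online setting; it assumes the new column is not already in span(Q) (rho > 0).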