Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity
Due to the fast speed of data generation and collection from advanced equipment, the amount of data quickly overflows the limit of available memory space and causes difficulties in achieving high learning accuracy. Several methods based on the discard-after-learn concept have been proposed. Some methods...
Main Authors: | Junsawang, Prem; Phimoltares, Suphakant; Lursinsap, Chidchanok |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2019 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6733468/ https://www.ncbi.nlm.nih.gov/pubmed/31498787 http://dx.doi.org/10.1371/journal.pone.0220624 |
_version_ | 1783449990058737664 |
---|---|
author | Junsawang, Prem Phimoltares, Suphakant Lursinsap, Chidchanok |
author_facet | Junsawang, Prem Phimoltares, Suphakant Lursinsap, Chidchanok |
author_sort | Junsawang, Prem |
collection | PubMed |
description | Due to the fast speed of data generation and collection from advanced equipment, the amount of data quickly overflows the limit of available memory space and causes difficulties in achieving high learning accuracy. Several methods based on the discard-after-learn concept have been proposed. Some were designed to cope with a single incoming datum, while others were designed for a chunk of incoming data. Although the results of these approaches are rather impressive, most of them rely on adding more neurons over time to learn new incoming data without any neuron-merging process, which clearly increases the computational time and space complexities. Only the online versatile elliptic basis function (VEBF) method introduced neuron merging to reduce the space-time complexity, and it learns only a single incoming datum at a time. This paper proposes a method that further enhances the discard-after-learn concept for a streaming data-chunk environment in terms of low computational time and neural space complexities. A set of recursive functions for computing the relevant parameters of a new neuron, based on a statistical confidence interval, is introduced. The proposed method, named streaming chunk incremental learning (SCIL), increases the plasticity and adaptability of the network structure according to the distribution of the incoming data and their classes. When compared with other incremental-like methods on 11 benchmark data sets of 150 to 581,012 samples, with 4 to 1,558 attributes, presented as streaming data, the proposed SCIL gave better accuracy and learning time on most data sets. |
format | Online Article Text |
id | pubmed-6733468 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-6733468 2019-09-20 Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity Junsawang, Prem Phimoltares, Suphakant Lursinsap, Chidchanok PLoS One Research Article Due to the fast speed of data generation and collection from advanced equipment, the amount of data quickly overflows the limit of available memory space and causes difficulties in achieving high learning accuracy. Several methods based on the discard-after-learn concept have been proposed. Some were designed to cope with a single incoming datum, while others were designed for a chunk of incoming data. Although the results of these approaches are rather impressive, most of them rely on adding more neurons over time to learn new incoming data without any neuron-merging process, which clearly increases the computational time and space complexities. Only the online versatile elliptic basis function (VEBF) method introduced neuron merging to reduce the space-time complexity, and it learns only a single incoming datum at a time. This paper proposes a method that further enhances the discard-after-learn concept for a streaming data-chunk environment in terms of low computational time and neural space complexities. A set of recursive functions for computing the relevant parameters of a new neuron, based on a statistical confidence interval, is introduced. The proposed method, named streaming chunk incremental learning (SCIL), increases the plasticity and adaptability of the network structure according to the distribution of the incoming data and their classes. When compared with other incremental-like methods on 11 benchmark data sets of 150 to 581,012 samples, with 4 to 1,558 attributes, presented as streaming data, the proposed SCIL gave better accuracy and learning time on most data sets. Public Library of Science 2019-09-09 /pmc/articles/PMC6733468/ /pubmed/31498787 http://dx.doi.org/10.1371/journal.pone.0220624 Text en © 2019 Junsawang et al http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Junsawang, Prem Phimoltares, Suphakant Lursinsap, Chidchanok Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity |
title | Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity |
title_full | Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity |
title_fullStr | Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity |
title_full_unstemmed | Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity |
title_short | Streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity |
title_sort | streaming chunk incremental learning for class-wise data stream classification with fast learning speed and low structural complexity |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6733468/ https://www.ncbi.nlm.nih.gov/pubmed/31498787 http://dx.doi.org/10.1371/journal.pone.0220624 |
work_keys_str_mv | AT junsawangprem streamingchunkincrementallearningforclasswisedatastreamclassificationwithfastlearningspeedandlowstructuralcomplexity AT phimoltaressuphakant streamingchunkincrementallearningforclasswisedatastreamclassificationwithfastlearningspeedandlowstructuralcomplexity AT lursinsapchidchanok streamingchunkincrementallearningforclasswisedatastreamclassificationwithfastlearningspeedandlowstructuralcomplexity |
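The description field above says that SCIL replaces per-sample updates with recursive functions that compute a new neuron's parameters directly from each incoming chunk, based on a statistical confidence interval, so that chunks can be discarded after learning. The paper's actual equations are not part of this record; the Python sketch below only illustrates the general idea of chunk-wise recursive learning under a discard-after-learn constraint, using the standard pooled-statistics recursion for a running mean and covariance. The `PrototypeNeuron` class, its method names, and this recursion are assumptions introduced here for illustration, not the authors' SCIL formulation (which also covers confidence-interval widths, neuron merging, and class-wise structure adaptation).

```python
import numpy as np

class PrototypeNeuron:
    """Illustrative class-wise prototype updated recursively from data chunks.

    Hypothetical sketch only -- not the SCIL update rules of Junsawang et al.
    It shows a generic pooled-statistics recursion for a center (mean) and a
    shape (covariance) learned chunk by chunk, discarding each chunk after use.
    """

    def __init__(self, dim, label):
        self.n = 0                          # number of samples absorbed so far
        self.mean = np.zeros(dim)           # running center
        self.cov = np.zeros((dim, dim))     # running covariance (ellipsoid shape)
        self.label = label                  # class this neuron serves

    def learn_chunk(self, X):
        """Fold a chunk X of shape (m, dim) into the running statistics."""
        m = X.shape[0]
        chunk_mean = X.mean(axis=0)
        dev = X - chunk_mean
        chunk_scatter = dev.T @ dev          # sum of squared deviations in the chunk

        if self.n == 0:
            self.n, self.mean = m, chunk_mean
            self.cov = chunk_scatter / max(m - 1, 1)
            return

        n, total = self.n, self.n + m
        delta = chunk_mean - self.mean
        old_scatter = self.cov * max(n - 1, 1)
        # standard formula for merging two groups' scatter matrices
        pooled = old_scatter + chunk_scatter + np.outer(delta, delta) * n * m / total

        self.mean = self.mean + delta * (m / total)
        self.cov = pooled / (total - 1)
        self.n = total


# Example: simulate a class-0 stream arriving in 10 chunks and learn it chunk by chunk.
rng = np.random.default_rng(0)
neuron = PrototypeNeuron(dim=4, label=0)
for chunk in np.array_split(rng.normal(size=(1000, 4)), 10):
    neuron.learn_chunk(chunk)               # after this call the chunk can be discarded
```

Because only the counts, means, and covariances are retained, memory use stays constant in the number of samples seen, which is the property the abstract attributes to discard-after-learn methods in general.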