
Adaptive model training strategy for continuous classification of time series


Bibliographic Details
Main Authors: Sun, Chenxi, Li, Hongyan, Song, Moxian, Cai, Derun, Zhang, Baofeng, Hong, Shenda
Format: Online Article Text
Language: English
Published: Springer US 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9922045/
https://www.ncbi.nlm.nih.gov/pubmed/36819946
http://dx.doi.org/10.1007/s10489-022-04433-z
author Sun, Chenxi
Li, Hongyan
Song, Moxian
Cai, Derun
Zhang, Baofeng
Hong, Shenda
collection PubMed
description The classification of time series is essential in many real-world applications such as healthcare. The class of a time series is usually labeled only at the final time step, but a growing number of time-sensitive applications require classifying time series continuously. For example, the outcome of a critical patient is only determined at the end of the stay, yet the patient must be diagnosed at every time step so that treatment can be given in time. To meet this demand, we propose a new concept, Continuous Classification of Time Series (CCTS). Unlike existing single-shot classification, the key challenge of CCTS is to model multiple distributions simultaneously, because the time series evolves dynamically. However, a deep learning model encounters the intertwined problems of catastrophic forgetting and over-fitting when learning multiple distributions. In this work, we find that well-designed distribution division and replay strategies in the model training process can help solve these problems. We propose a novel Adaptive model training strategy for CCTS (ACCTS). Its adaptability has two aspects: (1) an adaptive multi-distribution extraction policy: instead of relying on fixed rules and prior knowledge, ACCTS extracts data distributions adaptively to the evolution of the time series and the changes of the model; (2) an adaptive importance-based replay policy: instead of reviewing all old distributions, ACCTS replays only the samples that are important according to their contribution to the model. Experiments on four real-world datasets show that our method outperforms all baselines.
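The description above sketches the two ACCTS policies only at a high level. As a rough illustration of the second idea, here is a minimal, hypothetical PyTorch sketch of an importance-based replay step. It is not the authors' actual ACCTS algorithm: the names (ReplayBuffer, train_step, capacity, replay_k) and the use of per-sample loss as an importance proxy are assumptions made purely for illustration.

# Hypothetical sketch of importance-based replay for continuous
# time-series classification. The importance score (per-sample loss)
# and buffer size are illustrative stand-ins for the paper's
# "contribution to the model"; nothing here is taken from ACCTS itself.
import torch
import torch.nn as nn


class ReplayBuffer:
    """Keeps only the samples judged most important for the current model."""

    def __init__(self, capacity: int = 256):
        self.capacity = capacity
        self.items = []  # list of (importance, x, y) tuples

    def add(self, importance: float, x: torch.Tensor, y: torch.Tensor):
        self.items.append((importance, x, y))
        # Drop the least important samples once the buffer is full.
        self.items.sort(key=lambda t: t[0], reverse=True)
        del self.items[self.capacity:]

    def sample(self, k: int):
        chosen = self.items[:k]
        xs = torch.stack([x for _, x, _ in chosen])
        ys = torch.stack([y for _, _, y in chosen])
        return xs, ys


def train_step(model, criterion, optimizer, x_new, y_new, buffer, replay_k=32):
    """One step on the newest distribution plus an importance-based replay."""
    model.train()
    optimizer.zero_grad()

    # Loss on the newly arrived (current-distribution) batch.
    loss = criterion(model(x_new), y_new)

    # Replay a few high-importance samples from earlier distributions
    # to counter catastrophic forgetting.
    if len(buffer.items) > 0:
        x_old, y_old = buffer.sample(min(replay_k, len(buffer.items)))
        loss = loss + criterion(model(x_old), y_old)

    loss.backward()
    optimizer.step()

    # Score each new sample by its individual loss (a proxy for its
    # contribution to the model) and store it for future replay.
    with torch.no_grad():
        per_sample = nn.functional.cross_entropy(model(x_new), y_new, reduction="none")
    for xi, yi, imp in zip(x_new, y_new, per_sample):
        buffer.add(imp.item(), xi, yi)

The design choice illustrated is the one the description points at: rather than replaying every old distribution, only the samples with the highest estimated contribution (here, the largest per-sample loss) are retained and mixed into later training steps, which is the standard way replay is used to counter catastrophic forgetting.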
format Online
Article
Text
id pubmed-9922045
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Springer US
record_format MEDLINE/PubMed
spelling pubmed-9922045 2023-02-13. Adaptive model training strategy for continuous classification of time series. Sun, Chenxi; Li, Hongyan; Song, Moxian; Cai, Derun; Zhang, Baofeng; Hong, Shenda. Appl Intell (Dordr). Article. Springer US, published online 2023-02-11. /pmc/articles/PMC9922045/ /pubmed/36819946 http://dx.doi.org/10.1007/s10489-022-04433-z Text en © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.
title Adaptive model training strategy for continuous classification of time series
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9922045/
https://www.ncbi.nlm.nih.gov/pubmed/36819946
http://dx.doi.org/10.1007/s10489-022-04433-z