
Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization

By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and achieve better generalization. However, previous kernel-based LSTD algorithms do not consider regularization, and their sparsification processes are batch or offline, which hinders their widespread application in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms.
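To give a flavor of the ingredients listed in the abstract, here is a minimal, hypothetical sketch of a kernel recursive LSTD update combining online sparsification with L2 regularization. All class and parameter names are illustrative, the novelty test is a simple distance criterion standing in for the paper's actual sparsification rule, and the L1 regularization, sliding-window, and pruning components are omitted.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """RBF kernel between two state vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

class KernelRLSTD:
    """Sketch of kernel recursive LSTD with online sparsification and
    L2 regularization (names/parameters are assumptions, not the paper's)."""

    def __init__(self, gamma=0.95, eta=0.1, nu=0.3, sigma=1.0):
        self.gamma = gamma    # discount factor
        self.eta = eta        # L2 regularization strength
        self.nu = nu          # novelty threshold for sparsification
        self.sigma = sigma    # kernel width
        self.dictionary = []  # retained basis states
        self.P = None         # running inverse of the regularized A matrix
        self.w = None         # weights over dictionary features

    def features(self, s):
        return np.array([gaussian_kernel(s, d, self.sigma) for d in self.dictionary])

    def update(self, s, r, s_next):
        # (i) Online sparsification: admit s into the dictionary only if it
        # is far from every stored state (a crude stand-in for the paper's
        # actual criterion).
        if not self.dictionary or min(np.linalg.norm(s - d) for d in self.dictionary) > self.nu:
            self.dictionary.append(np.asarray(s, dtype=float))
            n = len(self.dictionary)
            # (ii) L2 regularization enters through P = (eta * I)^{-1};
            # re-initializing on growth is a simplification -- a full
            # implementation would extend P and w incrementally.
            self.P = np.eye(n) / self.eta
            self.w = np.zeros(n)

        phi = self.features(s)
        dphi = phi - self.gamma * self.features(s_next)  # LSTD feature difference

        # (iii) Recursive least squares via the Sherman-Morrison identity,
        # so no explicit matrix inversion is needed at any step.
        P_phi = self.P @ phi
        k = P_phi / (1.0 + dphi @ P_phi)
        self.w = self.w + k * (r - dphi @ self.w)
        self.P = self.P - np.outer(k, dphi @ self.P)

    def value(self, s):
        return float(self.features(s) @ self.w)
```

The rank-one Sherman-Morrison update keeps the per-step cost quadratic in the dictionary size, which is the point of combining recursive least squares with sparsification.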


Bibliographic Details
Main Authors: Zhang, Chunyuan; Zhu, Qingxin; Niu, Xinzheng
Format: Online Article Text
Language: English
Published: Hindawi Publishing Corporation, 2016
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4942627/
https://www.ncbi.nlm.nih.gov/pubmed/27436996
http://dx.doi.org/10.1155/2016/2305854
author Zhang, Chunyuan
Zhu, Qingxin
Niu, Xinzheng
collection PubMed
description By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and achieve better generalization. However, previous kernel-based LSTD algorithms do not consider regularization, and their sparsification processes are batch or offline, which hinders their widespread application in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms.
format Online
Article
Text
id pubmed-4942627
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher Hindawi Publishing Corporation
record_format MEDLINE/PubMed
spelling pubmed-4942627 2016-07-19 Comput Intell Neurosci Research Article Hindawi Publishing Corporation 2016 2016-06-29 Text en Copyright © 2016 Chunyuan Zhang et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
title Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization
topic Research Article