
A New Hierarchical Temporal Memory Algorithm Based on Activation Intensity

As a human-cortex-inspired computing model, hierarchical temporal memory (HTM) has shown great promise in sequence learning and has been applied to various time-series applications. HTM uses the combination of columns and neurons to learn the temporal patterns within the sequence. However, the conventional HTM model compacts the input into two naive column states—active and nonactive, and uses a fixed learning strategy. This simplicity limits the representation capability of HTM and ignores the impacts of active columns on learning the temporal context. To address these issues, we propose a new HTM algorithm based on activation intensity. By introducing the column activation intensity, more useful and fine-grained information from the input is retained for sequence learning. Furthermore, a self-adaptive nonlinear learning strategy is proposed where the synaptic connections are dynamically adjusted according to the activation intensity of columns. Extensive experiments are carried out on two real-world time-series datasets. Compared to the conventional HTM and LSTM model, our method achieved higher accuracy and less time overhead.


Bibliographic Details
Main Authors: Niu, Dejiao; Yang, Le; Cai, Tao; Li, Lei; Wu, Xudong; Wang, Zhidong
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8803450/
https://www.ncbi.nlm.nih.gov/pubmed/35111211
http://dx.doi.org/10.1155/2022/6072316
author Niu, Dejiao
Yang, Le
Cai, Tao
Li, Lei
Wu, Xudong
Wang, Zhidong
collection PubMed
description As a human-cortex-inspired computing model, hierarchical temporal memory (HTM) has shown great promise in sequence learning and has been applied to various time-series applications. HTM uses the combination of columns and neurons to learn the temporal patterns within the sequence. However, the conventional HTM model compacts the input into two naive column states—active and nonactive, and uses a fixed learning strategy. This simplicity limits the representation capability of HTM and ignores the impacts of active columns on learning the temporal context. To address these issues, we propose a new HTM algorithm based on activation intensity. By introducing the column activation intensity, more useful and fine-grained information from the input is retained for sequence learning. Furthermore, a self-adaptive nonlinear learning strategy is proposed where the synaptic connections are dynamically adjusted according to the activation intensity of columns. Extensive experiments are carried out on two real-world time-series datasets. Compared to the conventional HTM and LSTM model, our method achieved higher accuracy and less time overhead.
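The description above outlines the paper's two ideas at a high level: retaining a graded column activation intensity instead of a binary active/non-active state, and scaling the synaptic (permanence) updates by that intensity. A minimal NumPy sketch of how such a scheme could look follows; all names, dimensions, thresholds, and the specific nonlinearity (`intensity**2`) are hypothetical illustrations, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): 64 columns, each with 32 potential
# synapses onto a 128-bit input SDR.
n_columns, n_inputs, n_synapses = 64, 128, 32

# Each column's potential synapses: random input indices and permanences.
synapse_idx = rng.integers(0, n_inputs, size=(n_columns, n_synapses))
permanence = rng.uniform(0.0, 1.0, size=(n_columns, n_synapses))
CONNECTED = 0.5   # permanence threshold for a connected synapse
SPARSITY = 0.05   # fraction of columns allowed to become active

def spatial_pooling(input_sdr):
    """Return active columns and their graded activation intensities.

    Conventional HTM keeps only the binary active/non-active column
    state; the idea sketched here also retains an intensity, taken as
    the column's overlap normalized by the maximum overlap.
    """
    connected = permanence >= CONNECTED
    overlap = (input_sdr[synapse_idx] * connected).sum(axis=1)
    k = max(1, int(SPARSITY * n_columns))
    active = np.argsort(overlap)[-k:]                    # top-k columns win
    intensity = overlap[active] / max(1, overlap.max())  # graded, in [0, 1]
    return active, intensity

def adapt(input_sdr, active, intensity, base_rate=0.05):
    """Self-adaptive update (assumed form): scale the permanence change
    nonlinearly by each active column's intensity, rather than applying
    one fixed increment/decrement to every active column."""
    for col, s in zip(active, intensity):
        rate = base_rate * s**2                # hypothetical nonlinearity
        on = input_sdr[synapse_idx[col]] > 0
        permanence[col, on] = np.minimum(1.0, permanence[col, on] + rate)
        permanence[col, ~on] = np.maximum(0.0, permanence[col, ~on] - rate)

# One pooling + learning step on a random sparse input.
x = (rng.random(n_inputs) < 0.1).astype(float)
active, intensity = spatial_pooling(x)
adapt(x, active, intensity)
```

The key design point the abstract argues for is visible in `adapt`: a strongly activated column (intensity near 1) reinforces its synapses much faster than a marginally activated one, whereas the conventional fixed strategy would treat both identically.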
format Online
Article
Text
id pubmed-8803450
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-8803450 2022-02-01 Comput Intell Neurosci, Research Article. Hindawi 2022-01-24 /pmc/articles/PMC8803450/ /pubmed/35111211 http://dx.doi.org/10.1155/2022/6072316 Text en Copyright © 2022 Dejiao Niu et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
title A New Hierarchical Temporal Memory Algorithm Based on Activation Intensity
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8803450/
https://www.ncbi.nlm.nih.gov/pubmed/35111211
http://dx.doi.org/10.1155/2022/6072316
work_keys_str_mv AT niudejiao anewhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT yangle anewhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT caitao anewhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT lilei anewhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT wuxudong anewhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT wangzhidong anewhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT niudejiao newhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT yangle newhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT caitao newhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT lilei newhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT wuxudong newhierarchicaltemporalmemoryalgorithmbasedonactivationintensity
AT wangzhidong newhierarchicaltemporalmemoryalgorithmbasedonactivationintensity