
Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding


Bibliographic Details
Main Authors: Yu, Zhibin, Moirangthem, Dennis S., Lee, Minho
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5572368/
https://www.ncbi.nlm.nih.gov/pubmed/28878646
http://dx.doi.org/10.3389/fnbot.2017.00042
_version_ 1783259513578586112
author Yu, Zhibin
Moirangthem, Dennis S.
Lee, Minho
author_facet Yu, Zhibin
Moirangthem, Dennis S.
Lee, Minho
author_sort Yu, Zhibin
collection PubMed
description Understanding human intention by observing a series of human actions is a challenging task: it requires analyzing long sequences of intention-related actions and extracting context from their dynamic features. The multiple timescales recurrent neural network (MTRNN), one candidate solution, is a useful tool for recording and regenerating continuous signals in dynamic tasks. However, the conventional MTRNN suffers from the vanishing gradient problem, which makes it unsuitable for understanding longer sequences. To address this problem, we propose a new model, the Continuous Timescale Long-Short Term Memory (CTLSTM), which incorporates the multiple timescales concept into the Long-Short Term Memory (LSTM) recurrent neural network (RNN), an architecture that mitigates the vanishing gradient problem. We design an additional recurrent connection on the LSTM cell outputs that introduces a time delay, enabling the model to capture slow context. Our experiments show that the proposed model exhibits better context modeling and captures dynamic features on multiple large-scale classification tasks. The results illustrate that the multiple timescales concept enhances the model's ability to handle the longer sequences associated with human intentions, making it more suitable for complex tasks such as intention recognition.
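The abstract describes the key mechanism only at a high level: an extra recurrent connection on the LSTM cell outputs that produces a time delay, with a per-layer timescale as in MTRNN. A minimal sketch of one plausible reading follows, assuming the timescale acts as a leaky integrator (low-pass filter) on the LSTM output, so a larger timescale tau yields slower "context" dynamics. The class, weight layout, and parameter names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class CTLSTMCell:
    """Sketch of a continuous-timescale LSTM cell (hypothetical reading).

    A standard LSTM step whose output is passed through a leaky
    integrator with timescale tau, as in MTRNN; tau = 1 recovers the
    plain LSTM output, larger tau gives slower (context) dynamics.
    """

    def __init__(self, n_in, n_hid, tau=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.tau = tau
        # one stacked weight matrix for the four gates (i, f, o, g)
        self.W = rng.normal(0.0, 0.1, (4 * n_hid, n_in + n_hid))
        self.b = np.zeros(4 * n_hid)
        self.n_hid = n_hid

    def step(self, x, y_prev, c_prev):
        z = self.W @ np.concatenate([x, y_prev]) + self.b
        i, f, o, g = np.split(z, 4)
        c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)  # cell state
        h = sigmoid(o) * np.tanh(c)                        # instantaneous output
        # slow recurrent connection: leaky integration of the cell output
        y = (1.0 - 1.0 / self.tau) * y_prev + (1.0 / self.tau) * h
        return y, c
```

With y_prev = 0, a single step with tau = 10 produces an output one tenth the magnitude of the plain-LSTM (tau = 1) output, which is the time-delay effect the abstract attributes to the slow-context layer.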
format Online
Article
Text
id pubmed-5572368
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-5572368 2017-09-06 Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding Yu, Zhibin; Moirangthem, Dennis S.; Lee, Minho. Front Neurorobot (Neuroscience). Frontiers Media S.A. 2017-08-23 /pmc/articles/PMC5572368/ /pubmed/28878646 http://dx.doi.org/10.3389/fnbot.2017.00042 Text en Copyright © 2017 Yu, Moirangthem and Lee. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Yu, Zhibin
Moirangthem, Dennis S.
Lee, Minho
Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding
title Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding
title_full Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding
title_fullStr Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding
title_full_unstemmed Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding
title_short Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding
title_sort continuous timescale long-short term memory neural network for human intent understanding
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5572368/
https://www.ncbi.nlm.nih.gov/pubmed/28878646
http://dx.doi.org/10.3389/fnbot.2017.00042
work_keys_str_mv AT yuzhibin continuoustimescalelongshorttermmemoryneuralnetworkforhumanintentunderstanding
AT moirangthemdenniss continuoustimescalelongshorttermmemoryneuralnetworkforhumanintentunderstanding
AT leeminho continuoustimescalelongshorttermmemoryneuralnetworkforhumanintentunderstanding