
Transformer-Based Maneuvering Target Tracking


Bibliographic Details
Main Authors: Zhao, Guanghui, Wang, Zelin, Huang, Yixiong, Zhang, Huirong, Ma, Xiaojing
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9656253/
https://www.ncbi.nlm.nih.gov/pubmed/36366180
http://dx.doi.org/10.3390/s22218482
_version_ 1784829387786420224
author Zhao, Guanghui
Wang, Zelin
Huang, Yixiong
Zhang, Huirong
Ma, Xiaojing
author_facet Zhao, Guanghui
Wang, Zelin
Huang, Yixiong
Zhang, Huirong
Ma, Xiaojing
author_sort Zhao, Guanghui
collection PubMed
description When tracking maneuvering targets, recurrent neural networks (RNNs), especially long short-term memory (LSTM) networks, are widely used to sequentially capture the motion states of targets from observations. However, LSTMs can only extract trajectory features step by step, so their modeling of maneuvering motion lacks a global view. Meanwhile, trajectory datasets are often generated within a large but fixed distance range: the uncertainty of the targets' initial positions increases the complexity of network training, and the fixed distance range limits how well the network generalizes to trajectories outside the dataset. In this study, we propose a transformer-based network (TBN), consisting of an encoder (transformer layers) and a decoder (one-dimensional convolutional layers), to track maneuvering targets. Aided by the transformer's attention mechanism, the TBN captures both long- and short-term dependencies of target states from a global perspective. Moreover, we propose a center-max normalization that reduces the complexity of TBN training and improves its generalization. Experimental results show that the proposed methods outperform an LSTM-based tracking network.
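
The description above outlines the TBN architecture (a transformer-encoder front end followed by a one-dimensional convolutional decoder) and a center-max normalization of the input trajectories. The PyTorch sketch below illustrates one plausible reading of that design; the layer sizes, the exact form of the normalization, and the names center_max_normalize and TBN are assumptions made for illustration, not the authors' published implementation.

# Minimal sketch of a transformer-based tracking network (TBN) as suggested
# by the abstract: transformer-encoder layers followed by 1-D convolutional
# decoder layers, with a center-max normalization of observation sequences.
# Layer sizes and the normalization details are assumptions; a positional
# encoding, which a real implementation would likely add, is omitted for brevity.
import torch
import torch.nn as nn


def center_max_normalize(obs: torch.Tensor) -> torch.Tensor:
    # obs: (batch, seq_len, dims) observation sequence.
    # Center each trajectory on its own mean position (removing dependence on
    # the absolute initial position), then scale by the maximum absolute
    # deviation (removing dependence on the dataset's fixed distance range).
    center = obs.mean(dim=1, keepdim=True)
    centered = obs - center
    scale = centered.abs().amax(dim=(1, 2), keepdim=True).clamp_min(1e-8)
    return centered / scale


class TBN(nn.Module):
    # Encoder: stacked transformer layers; decoder: 1-D convolutional layers.
    def __init__(self, obs_dim=2, state_dim=4, d_model=64, nhead=4, num_layers=3):
        super().__init__()
        self.input_proj = nn.Linear(obs_dim, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=num_layers)
        # Conv1d works on (batch, channels, time), so channels = d_model.
        self.decoder = nn.Sequential(
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(d_model, state_dim, kernel_size=3, padding=1),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        x = self.input_proj(obs)                # (B, T, d_model)
        x = self.encoder(x)                     # attention over the whole sequence
        x = x.transpose(1, 2)                   # (B, d_model, T) for Conv1d
        return self.decoder(x).transpose(1, 2)  # (B, T, state_dim) estimated states


# Usage with a synthetic batch of 2-D position observations.
obs = torch.randn(8, 100, 2)
states = TBN()(center_max_normalize(obs))
print(states.shape)  # torch.Size([8, 100, 4])

In this sketch the normalization matches the motivation given in the abstract: centering removes the uncertainty of the initial position, and max-scaling removes the dependence on the fixed distance range of the training data, which is what is claimed to ease training and improve generalization.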
format Online
Article
Text
id pubmed-9656253
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9656253 2022-11-15 Transformer-Based Maneuvering Target Tracking Zhao, Guanghui Wang, Zelin Huang, Yixiong Zhang, Huirong Ma, Xiaojing Sensors (Basel) Communication When tracking maneuvering targets, recurrent neural networks (RNNs), especially long short-term memory (LSTM) networks, are widely applied to sequentially capture the motion states of targets from observations. However, LSTMs can only extract features of trajectories stepwise; thus, their modeling of maneuvering motion lacks globality. Meanwhile, trajectory datasets are often generated within a large, but fixed distance range. Therefore, the uncertainty of the initial position of targets increases the complexity of network training, and the fixed distance range reduces the generalization of the network to trajectories outside the dataset. In this study, we propose a transformer-based network (TBN) that consists of an encoder part (transformer layers) and a decoder part (one-dimensional convolutional layers), to track maneuvering targets. Assisted by the attention mechanism of the transformer network, the TBN can capture the long short-term dependencies of target states from a global perspective. Moreover, we propose a center–max normalization to reduce the complexity of TBN training and improve its generalization. The experimental results show that our proposed methods outperform the LSTM-based tracking network. MDPI 2022-11-04 /pmc/articles/PMC9656253/ /pubmed/36366180 http://dx.doi.org/10.3390/s22218482 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Communication
Zhao, Guanghui
Wang, Zelin
Huang, Yixiong
Zhang, Huirong
Ma, Xiaojing
Transformer-Based Maneuvering Target Tracking
title Transformer-Based Maneuvering Target Tracking
title_full Transformer-Based Maneuvering Target Tracking
title_fullStr Transformer-Based Maneuvering Target Tracking
title_full_unstemmed Transformer-Based Maneuvering Target Tracking
title_short Transformer-Based Maneuvering Target Tracking
title_sort transformer-based maneuvering target tracking
topic Communication
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9656253/
https://www.ncbi.nlm.nih.gov/pubmed/36366180
http://dx.doi.org/10.3390/s22218482
work_keys_str_mv AT zhaoguanghui transformerbasedmaneuveringtargettracking
AT wangzelin transformerbasedmaneuveringtargettracking
AT huangyixiong transformerbasedmaneuveringtargettracking
AT zhanghuirong transformerbasedmaneuveringtargettracking
AT maxiaojing transformerbasedmaneuveringtargettracking