Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer
An autonomous place recognition system is essential in scenarios where GPS is unavailable, such as underground tunnels. However, existing algorithms struggle to fully exploit the small number of effective features in underground tunnel data, so recognition accuracy is difficult to guarantee.
Main Authors: | Chai, Xinghua; Yang, Jianyong; Yan, Xiangming; Di, Chengliang; Ye, Tao |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10675832/ https://www.ncbi.nlm.nih.gov/pubmed/38005647 http://dx.doi.org/10.3390/s23229261 |
_version_ | 1785149858673328128 |
---|---|
author | Chai, Xinghua; Yang, Jianyong; Yan, Xiangming; Di, Chengliang; Ye, Tao
author_facet | Chai, Xinghua; Yang, Jianyong; Yan, Xiangming; Di, Chengliang; Ye, Tao
author_sort | Chai, Xinghua |
collection | PubMed |
description | An autonomous place recognition system is essential in scenarios where GPS is unavailable, such as underground tunnels. However, existing algorithms struggle to fully exploit the small number of effective features in underground tunnel data, so recognition accuracy is difficult to guarantee. To address this challenge, this paper proposes an efficient point cloud place recognition algorithm named Dual-Attention Transformer Network (DAT-Net). The algorithm first applies a farthest point downsampling module to eliminate invalid redundant points from the point cloud while retaining its basic shape, which reduces the size of the point cloud and limits the influence of invalid points on subsequent analysis. The paper then proposes a dual-attention Transformer module that facilitates local information exchange through a multi-head self-attention mechanism; it extracts local descriptors and, with the help of a feature fusion layer, integrates them into a highly discriminative global descriptor based on global context, yielding a more accurate and robust global feature representation. Experimental results show that the proposed method achieves an average F1 score of 0.841 on the SubT-Tunnel dataset and outperforms many existing state-of-the-art algorithms in recognition accuracy and robustness tests. |
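The farthest point downsampling step mentioned in the abstract is not detailed in this record, but the standard farthest point sampling (FPS) procedure it refers to can be sketched briefly. The following is a minimal illustrative NumPy sketch, not the authors' code; the function name, array shapes, and the `num_samples` parameter are assumptions made for the example.

```python
# Illustrative sketch of farthest point sampling (FPS) for point cloud
# downsampling. NOT the authors' implementation; shapes and parameter
# names are assumptions for this example.
import numpy as np

def farthest_point_sampling(points: np.ndarray, num_samples: int) -> np.ndarray:
    """Select `num_samples` indices from `points` (N, 3) that are mutually far apart."""
    n = points.shape[0]
    selected = np.zeros(num_samples, dtype=np.int64)
    # Distance from every point to the nearest already-selected point.
    min_dist = np.full(n, np.inf)
    # Start from an arbitrary point (here: index 0).
    current = 0
    for i in range(num_samples):
        selected[i] = current
        # Update nearest-selected-point distances with the newly added point.
        dist = np.sum((points - points[current]) ** 2, axis=1)
        min_dist = np.minimum(min_dist, dist)
        # The next sample is the point farthest from the current selection.
        current = int(np.argmax(min_dist))
    return selected

# Usage: downsample a random 4096-point cloud to 1024 points.
cloud = np.random.rand(4096, 3)
downsampled = cloud[farthest_point_sampling(cloud, 1024)]
```

Likewise, the dual-attention Transformer module itself is not specified here, but the general pattern the abstract describes (multi-head self-attention over per-point local descriptors, followed by a fusion step that produces a single global descriptor) can be sketched as below. All layer sizes, the pooling choice, and the class name are assumptions for illustration, not the authors' DAT-Net architecture.

```python
# Hedged sketch of the pattern described in the abstract: multi-head
# self-attention over per-point local features, then a fusion step that
# pools them into one global descriptor. Dimensions, pooling, and naming
# are illustrative assumptions, not the authors' DAT-Net.
import torch
import torch.nn as nn

class AttentionGlobalDescriptor(nn.Module):
    def __init__(self, feat_dim: int = 256, num_heads: int = 4):
        super().__init__()
        # Local information exchange between points via multi-head self-attention.
        self.self_attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(feat_dim)
        # Simple feature-fusion head: concatenated max- and mean-pooled features.
        self.fuse = nn.Linear(2 * feat_dim, feat_dim)

    def forward(self, local_feats: torch.Tensor) -> torch.Tensor:
        # local_feats: (batch, num_points, feat_dim) per-point local descriptors.
        attended, _ = self.self_attn(local_feats, local_feats, local_feats)
        x = self.norm(local_feats + attended)  # residual connection
        # Aggregate per-point features into a single global context vector.
        pooled = torch.cat([x.max(dim=1).values, x.mean(dim=1)], dim=-1)
        return self.fuse(pooled)  # (batch, feat_dim) global descriptor

# Usage: two clouds of 1024 downsampled points with 256-dim local features.
feats = torch.randn(2, 1024, 256)
print(AttentionGlobalDescriptor()(feats).shape)  # torch.Size([2, 256])
```

Max- and mean-pooling are used here only as a generic aggregation choice; the record does not describe how DAT-Net's feature fusion layer actually combines local descriptors with global context.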
format | Online Article Text |
id | pubmed-10675832 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10675832 2023-11-18 Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer Chai, Xinghua; Yang, Jianyong; Yan, Xiangming; Di, Chengliang; Ye, Tao Sensors (Basel) Article An autonomous place recognition system is essential in scenarios where GPS is unavailable, such as underground tunnels. However, existing algorithms struggle to fully exploit the small number of effective features in underground tunnel data, so recognition accuracy is difficult to guarantee. To address this challenge, this paper proposes an efficient point cloud place recognition algorithm named Dual-Attention Transformer Network (DAT-Net). The algorithm first applies a farthest point downsampling module to eliminate invalid redundant points from the point cloud while retaining its basic shape, which reduces the size of the point cloud and limits the influence of invalid points on subsequent analysis. The paper then proposes a dual-attention Transformer module that facilitates local information exchange through a multi-head self-attention mechanism; it extracts local descriptors and, with the help of a feature fusion layer, integrates them into a highly discriminative global descriptor based on global context, yielding a more accurate and robust global feature representation. Experimental results show that the proposed method achieves an average F1 score of 0.841 on the SubT-Tunnel dataset and outperforms many existing state-of-the-art algorithms in recognition accuracy and robustness tests. MDPI 2023-11-18 /pmc/articles/PMC10675832/ /pubmed/38005647 http://dx.doi.org/10.3390/s23229261 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Chai, Xinghua Yang, Jianyong Yan, Xiangming Di, Chengliang Ye, Tao Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer |
title | Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer |
title_full | Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer |
title_fullStr | Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer |
title_full_unstemmed | Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer |
title_short | Efficient Underground Tunnel Place Recognition Algorithm Based on Farthest Point Subsampling and Dual-Attention Transformer |
title_sort | efficient underground tunnel place recognition algorithm based on farthest point subsampling and dual-attention transformer |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10675832/ https://www.ncbi.nlm.nih.gov/pubmed/38005647 http://dx.doi.org/10.3390/s23229261 |
work_keys_str_mv | AT chaixinghua efficientundergroundtunnelplacerecognitionalgorithmbasedonfarthestpointsubsamplinganddualattentiontransformer AT yangjianyong efficientundergroundtunnelplacerecognitionalgorithmbasedonfarthestpointsubsamplinganddualattentiontransformer AT yanxiangming efficientundergroundtunnelplacerecognitionalgorithmbasedonfarthestpointsubsamplinganddualattentiontransformer AT dichengliang efficientundergroundtunnelplacerecognitionalgorithmbasedonfarthestpointsubsamplinganddualattentiontransformer AT yetao efficientundergroundtunnelplacerecognitionalgorithmbasedonfarthestpointsubsamplinganddualattentiontransformer |