
SU-Net: pose estimation network for non-cooperative spacecraft on-orbit

The estimation of spacecraft pose is crucial in numerous space missions, including rendezvous and docking, debris removal, and on-orbit maintenance. Estimating the pose of space objects is significantly more challenging than that of objects on Earth, primarily because of the widely varying lighting conditions, low resolution, and limited amount of data available in space images. Our main proposal is a new deep learning neural network architecture that can effectively extract orbiting spacecraft features from images captured by inverse synthetic aperture radar (ISAR) for pose estimation of non-cooperative on-orbit spacecraft. Specifically, our model enhances spacecraft imaging by improving image contrast, reducing noise, and using transfer learning with a pre-trained model to mitigate data sparsity. To address sparse features in spacecraft imaging, we propose a dense residual U-Net that employs dense residual blocks to reduce feature loss during downsampling. Additionally, we introduce a multi-head self-attention block to capture more global information and improve the model's accuracy. The resulting tightly interlinked architecture, named SU-Net, delivers strong performance gains on pose estimation from spacecraft ISAR imaging. Experimental results show that we achieve state-of-the-art results: the absolute error of our model ranges from 0.128[Formula: see text] to 0.4491[Formula: see text], the mean error is about 0.282[Formula: see text], and the standard deviation is about 0.065[Formula: see text]. The code is released at https://github.com/Tombs98/SU-Net.

Bibliographic Details
Main Authors: Gao, Hu, Li, Zhihui, Wang, Ning, Yang, Jingfan, Dang, Depeng
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10361969/
https://www.ncbi.nlm.nih.gov/pubmed/37479871
http://dx.doi.org/10.1038/s41598-023-38974-1
author Gao, Hu
Li, Zhihui
Wang, Ning
Yang, Jingfan
Dang, Depeng
collection PubMed
description The estimation of spacecraft pose is crucial in numerous space missions, including rendezvous and docking, debris removal, and on-orbit maintenance. Estimating the pose of space objects is significantly more challenging than that of objects on Earth, primarily because of the widely varying lighting conditions, low resolution, and limited amount of data available in space images. Our main proposal is a new deep learning neural network architecture that can effectively extract orbiting spacecraft features from images captured by inverse synthetic aperture radar (ISAR) for pose estimation of non-cooperative on-orbit spacecraft. Specifically, our model enhances spacecraft imaging by improving image contrast, reducing noise, and using transfer learning with a pre-trained model to mitigate data sparsity. To address sparse features in spacecraft imaging, we propose a dense residual U-Net that employs dense residual blocks to reduce feature loss during downsampling. Additionally, we introduce a multi-head self-attention block to capture more global information and improve the model's accuracy. The resulting tightly interlinked architecture, named SU-Net, delivers strong performance gains on pose estimation from spacecraft ISAR imaging. Experimental results show that we achieve state-of-the-art results: the absolute error of our model ranges from 0.128[Formula: see text] to 0.4491[Formula: see text], the mean error is about 0.282[Formula: see text], and the standard deviation is about 0.065[Formula: see text]. The code is released at https://github.com/Tombs98/SU-Net.
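The abstract attributes part of SU-Net's accuracy gain to a multi-head self-attention block that captures global context across the ISAR feature map. The authors' actual implementation is in the linked GitHub repository; as a rough, framework-free illustration of the standard multi-head self-attention operation the abstract refers to, the sketch below applies it to a flattened feature map in NumPy. All names, shapes, and weight initializations here are illustrative assumptions, not the paper's code.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, wq, wk, wv, wo, num_heads):
    """Standard multi-head self-attention on a flattened feature map.

    x          : (seq_len, d_model) array, e.g. an H*W grid of d_model-dim features
    wq, wk, wv : (d_model, d_model) projection weights for queries, keys, values
    wo         : (d_model, d_model) output projection
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project and split into heads: (num_heads, seq_len, d_head)
    q = (x @ wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    # Scaled dot-product attention per head: (num_heads, seq_len, seq_len)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)

    # Weighted sum of values, merge heads back, project out
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ wo
```

Because every output position attends to every input position, each feature vector can aggregate information from the whole image, which is the "more global information" the abstract contrasts with the local receptive fields of plain convolutions.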
format Online
Article
Text
id pubmed-10361969
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-103619692023-07-23 SU-Net: pose estimation network for non-cooperative spacecraft on-orbit Gao, Hu Li, Zhihui Wang, Ning Yang, Jingfan Dang, Depeng Sci Rep Article
Nature Publishing Group UK 2023-07-21 /pmc/articles/PMC10361969/ /pubmed/37479871 http://dx.doi.org/10.1038/s41598-023-38974-1 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) .
title SU-Net: pose estimation network for non-cooperative spacecraft on-orbit
topic Article