SU-Net: Pose estimation network for non-cooperative spacecraft on-orbit

21 Feb 2023 · Hu Gao, Zhihui Li, Depeng Dang, Ning Wang, Jingfan Yang

Spacecraft pose estimation plays a vital role in many on-orbit space missions, such as rendezvous and docking, debris removal, and on-orbit maintenance. Because space images exhibit widely varying lighting conditions, high contrast, and low resolution, pose estimation of space objects is more challenging than that of objects on Earth. In this paper, we analyze the characteristics of on-orbit spacecraft radar images and propose a new deep learning network architecture, the Dense Residual U-shaped Network (DR-U-Net), to extract image features. Building on DR-U-Net, we further introduce the Spacecraft U-shaped Network (SU-Net) to achieve end-to-end pose estimation for non-cooperative spacecraft. Specifically, SU-Net first preprocesses the non-cooperative spacecraft image, and transfer learning is then used for pre-training. Subsequently, to address radar image blur and weak spacecraft contour recognition, we add residual and dense connections to the U-Net backbone, yielding DR-U-Net. This reduces feature loss and model complexity, and avoids the degradation of deep neural networks during training. Finally, a feedforward neural network layer regresses the on-orbit pose of the non-cooperative spacecraft. Experiments show that the proposed method does not rely on hand-crafted, object-specific features, that the model is robust, and that its accuracy outperforms state-of-the-art pose estimation methods. The absolute error ranges from 0.1557 to 0.4491, the mean error is about 0.302, and the standard deviation is about 0.065.
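
The abstract gives only a high-level description of the architecture, so the PyTorch sketch below is an illustrative approximation, not the authors' implementation: the dense-layer count and growth rate, the encoder/decoder depth, the single-channel radar input, and the 6-dimensional pose output are all assumptions. It shows the core idea of a block with dense connections inside and a residual skip around it, embedded in a U-Net with a feedforward regression head.

```python
import torch
import torch.nn as nn

class DenseResidualBlock(nn.Module):
    """Hypothetical DR block: dense connections within the block,
    plus a residual skip around it (layer count/widths are assumptions)."""
    def __init__(self, channels: int, growth: int = 32, layers: int = 3):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = channels
        for _ in range(layers):
            self.convs.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, kernel_size=3, padding=1),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True),
            ))
            in_ch += growth  # dense: each layer sees all earlier features
        # 1x1 conv fuses the concatenated dense features back to the input width
        self.fuse = nn.Conv2d(in_ch, channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        return x + self.fuse(torch.cat(feats, dim=1))  # residual skip

class SUNet(nn.Module):
    """Minimal SU-Net-style model: a shallow DR-U-Net encoder/decoder
    followed by a feedforward head regressing a pose vector."""
    def __init__(self, pose_dim: int = 6):  # 6-DoF output is an assumption
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
                                  DenseResidualBlock(64))
        self.pool = nn.MaxPool2d(2)
        self.enc2 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
                                  DenseResidualBlock(128))
        self.up = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec1 = nn.Sequential(nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
                                  DenseResidualBlock(64))
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(64, pose_dim))

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # U-Net skip
        return self.head(d1)

# Example: one 256x256 single-channel radar image -> pose vector
model = SUNet()
pose = model(torch.randn(1, 1, 256, 256))
print(pose.shape)  # torch.Size([1, 6])
```

The residual skip around each dense block is what counters the degradation problem mentioned in the abstract, while the 1x1 fusion keeps channel counts, and hence model complexity, bounded despite the dense concatenations.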
