Abstract

In on-orbit servicing, estimating the relative position and attitude of a non-cooperative spacecraft has become a key problem in many space missions. Because non-cooperative space targets lack known artificial markers and cannot communicate with the servicing spacecraft, the relative position and attitude estimation system faces great challenges in terms of accuracy, intelligence, and power consumption. To address these issues, this study uses a stereo camera to extract the feature points of a non-cooperative spacecraft. The 3D positions of the feature points are then calculated according to the camera model to estimate the relative pose. The optical flow method is also used to obtain geometric constraint information between frames. In addition, an extended Kalman filter (EKF) is used to update the measurements and obtain more accurate pose estimates. Moreover, we present a closed-loop simulation system in which the Unity simulation engine simulates the relative motion of the spacecraft and the binocular vision images, and a Jetson TX2 embedded computer runs the proposed autonomous relative navigation algorithm. The simulation results show that our approach can estimate the non-cooperative target’s relative pose accurately.
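As a rough illustration of the triangulation step described above, the sketch below recovers the 3D positions of matched feature points from a rectified stereo pair under a pinhole camera model. The paper does not give its exact formulation, so the function name, parameter names, and the rectified-pair assumption are illustrative only.

import numpy as np

def triangulate_stereo(pts_left, pts_right, f, cx, cy, baseline):
    """Recover 3D points (in the left-camera frame) from matched feature
    points in a rectified stereo pair (minimal sketch, not the paper's code).

    pts_left, pts_right : (N, 2) pixel coordinates (u, v)
    f        : focal length in pixels (assumed identical for both cameras)
    cx, cy   : principal point in pixels
    baseline : stereo baseline in metres
    """
    pts_left = np.asarray(pts_left, dtype=float)
    pts_right = np.asarray(pts_right, dtype=float)

    disparity = pts_left[:, 0] - pts_right[:, 0]            # u_L - u_R
    disparity = np.where(np.abs(disparity) < 1e-6, np.nan, disparity)

    Z = f * baseline / disparity                            # depth along optical axis
    X = (pts_left[:, 0] - cx) * Z / f
    Y = (pts_left[:, 1] - cy) * Z / f
    return np.column_stack([X, Y, Z])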

Highlights

  • The space application capabilities of various countries have been continuously improving

  • Lichter [14] developed a method for simultaneously estimating dynamic state, model parameters, and geometric shape of arbitrary space targets, using information gathered from range imaging sensors

  • In the non-cooperative target (NCT) autonomous navigation algorithm, the relative position, velocity, attitude, and angular velocity of the target spacecraft with respect to the left-camera frame are calculated from the binocular images (see the sketch after this list)

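The relative attitude and position mentioned in the last highlight can, for example, be recovered from 3D feature points tracked between two frames with an SVD-based rigid-transform fit. The paper does not specify its pose solver, so the sketch below is only one common choice, and all names in it are assumptions.

import numpy as np

def relative_pose_from_points(P_prev, P_curr):
    """Estimate rotation R and translation t such that P_curr ≈ R @ P_prev + t,
    using the SVD-based (Kabsch) solution on matched 3D feature points.

    P_prev, P_curr : (N, 3) 3D points in the left-camera frame at two epochs.
    """
    c_prev = P_prev.mean(axis=0)
    c_curr = P_curr.mean(axis=0)

    # Cross-covariance of the centred point sets
    H = (P_prev - c_prev).T @ (P_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)

    # Guard against a reflection in the least-squares solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_curr - R @ c_prev
    return R, t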

Summary

Introduction

The space application capacity of various countries has been continuously improving. Qiu et al. [10] obtained relative measurements of a non-cooperative spacecraft by simultaneously using a visible-light angle-measuring camera and a laser rangefinder, and designed filters for the two sensors. Lichter [14] developed a method for simultaneously estimating the dynamic state, model parameters, and geometric shape of arbitrary space targets, using information gathered from range imaging sensors. Segal et al. [20] extracted the feature points of a target spacecraft, took the relative pose as the state quantity in the observation model of a binocular camera, and optimized the relative measurement with an iterated EKF. In the filter formulation, the position error is defined as $\Delta x_p = x_p - \hat{x}_p$. When the tracking spacecraft has, and only has, orbital angular velocity, it satisfies $R_{tc} I_c^{-1} T_c - \omega_{cci}^{\times} I_c \omega_{cci} = 0$. When the sampling time is short, the first-order approximation of the discrete-time process noise covariance matrix is $\Delta t\, G Q G^{T}$, and the state transition matrix is $\Phi_{k+1,k} \approx I_{12\times 12} + F_{11}\Delta t$.
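The first-order discretisation quoted above maps directly onto an EKF prediction step. The sketch below illustrates it under the assumption of a 12-dimensional state and a linearised propagation model; the variable names and state layout are assumptions, not the paper's exact formulation.

import numpy as np

def ekf_predict(x, P, F, G, Q, dt):
    """One EKF prediction step using the first-order discretisation
    Phi ≈ I + F*dt and Qd ≈ dt * G @ Q @ G.T (sketch under assumed state layout).

    x : (12,) state, e.g. relative position, velocity, attitude error, angular velocity
    P : (12, 12) state covariance
    F : (12, 12) continuous-time Jacobian of the relative dynamics
    G : (12, m) noise input matrix,  Q : (m, m) continuous-time noise covariance
    """
    Phi = np.eye(12) + F * dt          # state transition matrix, first-order approximation
    Qd = dt * G @ Q @ G.T              # discrete-time process noise covariance
    x_pred = Phi @ x                   # propagate state (linearised)
    P_pred = Phi @ P @ Phi.T + Qd      # propagate covariance
    return x_pred, P_pred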

Measurement Model
Calculation in NCT Autonomous Navigation Algorithm
Filter Initiation
Filter Navigation Performance
Result
