Abstract

Most large space debris carries large residual angular momentum, so de-tumbling and capture operations can easily destabilize the tracking satellite and cause mission failure. Real-time identification of the debris's dynamic parameters is therefore needed before the de-tumbling and capture operation, improving the efficiency and success rate of active debris removal (ADR) missions. A method for identifying dynamic parameters based on the fusion of visual and inertial data is proposed. To obtain the inertial data, inertial measurement units (IMUs) carrying light markers are fixed to the debris surface by a space harpoon, a technique that has been demonstrated in orbit; a binocular vision system mounted at the front of the tracking satellite measures the coordinates of the light markers. A novel method for denoising the inertial data is proposed to eliminate interference from the space environment. Based on the denoised data and the marker coordinates, the mass-center location is estimated, the normalized angular momentum is calculated using the characteristics of Euler–Poinsot motion, and all parameters required for active debris removal are determined. Simulations with Gaussian noise and experiments in a controlled laboratory were conducted; the results indicate that the method provides accurate dynamic parameters for the ADR mission.
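
Below is a minimal sketch (Python/NumPy) of one way the mass-center location could be recovered from the stereo-triangulated light-marker trajectory: assuming torque-free tumbling with a mass centre that is approximately stationary in the camera frame over the observation window, each marker traces a sphere centred on the mass centre, so a linear least-squares sphere fit yields the centre. The function name and the sphere-fit formulation are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def fit_rotation_center(marker_traj):
    """Least-squares sphere fit: find the point equidistant from all samples
    of one marker's trajectory (N x 3 stereo-triangulated positions).

    For torque-free tumbling with negligible mass-centre translation, that
    point approximates the mass-center location in the camera frame.
    """
    p = np.asarray(marker_traj, dtype=float)
    # |p - c|^2 = R^2  ->  2 p.c + (R^2 - |c|^2) = |p|^2, linear in c and d
    A = np.hstack([2.0 * p, np.ones((p.shape[0], 1))])
    b = np.sum(p * p, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = np.sqrt(d + center @ center)
    return center, radius

# Synthetic check: a marker moving on a sphere about a known centre
t = np.linspace(0.0, 10.0, 200)
c_true = np.array([1.0, -0.5, 2.0])
u = np.column_stack([np.cos(t) * np.cos(0.3 * t),
                     np.sin(t) * np.cos(0.3 * t),
                     np.sin(0.3 * t)])          # unit vectors on a sphere
c_est, r_est = fit_rotation_center(c_true + 0.3 * u)
print(c_est, r_est)                              # ~[1.0, -0.5, 2.0], ~0.3
```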

Highlights

  • As human space activities increase, the space-debris environment gradually deteriorates and the probability of collision with orbiting spacecraft grows [1,2,3,4]. Large space debris such as rocket upper stages will create a large number of new fragments after collision and disintegration, which seriously threatens the safety of orbiting spacecraft

  • To address the shortcomings of existing methods, this paper proposes an inertial parameter estimation method for non-cooperative targets based on the fusion of binocular vision and an inertial measurement unit (IMU)

  • Because of space-environment disturbances such as large temperature differences, radiation and magnetic interference, the IMU measurement data often contain substantial noise, which can greatly degrade the accuracy of the subsequently identified dynamic parameters of the space debris (see the denoising sketch after this list)
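
The paper's own denoising model for the redundant IMU measurement data is not reproduced here; the sketch below only illustrates the general idea under stated assumptions: redundant gyro channels expressed in a common debris frame are averaged to suppress uncorrelated sensor noise and then passed through a first-order low-pass filter. The function name and filter gain are hypothetical.

```python
import numpy as np

def denoise_redundant_gyro(omega_meas, alpha=0.1):
    """Fuse and smooth redundant gyro measurements.

    omega_meas : (N, M, 3) array -- N time samples from M redundant IMUs,
                 each expressing angular velocity in a common debris frame.
    alpha      : first-order low-pass gain (0 < alpha <= 1).

    Returns an (N, 3) denoised angular-velocity history.
    """
    omega_meas = np.asarray(omega_meas, dtype=float)
    fused = omega_meas.mean(axis=1)          # average out uncorrelated sensor noise
    smooth = np.empty_like(fused)
    smooth[0] = fused[0]
    for k in range(1, len(fused)):           # exponential (first-order IIR) smoothing
        smooth[k] = (1.0 - alpha) * smooth[k - 1] + alpha * fused[k]
    return smooth
```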

Summary

Introduction

As human space activities increase, the space-debris environment gradually deteriorates and the probability of collision with orbiting spacecraft grows [1,2,3,4]. Tweddle et al. [7,8] proposed a method for estimating the dynamic parameters of non-cooperative targets based on binocular vision. To address the shortcomings of existing methods, this paper proposes an inertial parameter estimation method for non-cooperative targets that fuses binocular vision with an inertial measurement unit (IMU). The method combines the intuitiveness of the visual approach with the accuracy of inertial sensing; it can quickly and accurately estimate the target's dynamic parameters and convert them to the ADR coordinate system, which facilitates computing the direction and magnitude of the de-tumbling and capture torque. Simulations and experiments on dynamic parameter identification were carried out, and the results demonstrate the real-time performance and high precision of the method.
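
As an illustration of the last point (not the paper's control law), the sketch below rotates an estimated body-frame angular velocity into an assumed ADR frame and derives the direction and magnitude of a simple rate-damping de-tumbling torque; the rotation matrix, gain, and function names are assumptions made for the example.

```python
import numpy as np

def to_adr_frame(R_adr_body, vec_body):
    """Rotate a body-frame vector (e.g., angular velocity or normalized
    angular momentum) into the chaser's ADR frame, given the rotation
    matrix estimated from the stereo-tracked light markers."""
    return np.asarray(R_adr_body) @ np.asarray(vec_body)

def detumble_torque(omega_adr, gain=0.05):
    """Simple rate-damping law: command a torque opposing the debris
    angular velocity expressed in the ADR frame. The gain is illustrative."""
    tau = -gain * np.asarray(omega_adr, dtype=float)
    return tau, np.linalg.norm(tau)          # direction (vector) and magnitude

# Example: tumbling mostly about the body z-axis, chaser rotated 90 deg about x
R = np.array([[1.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
omega_body = np.array([0.02, 0.01, 0.15])    # rad/s
tau_vec, tau_mag = detumble_torque(to_adr_frame(R, omega_body))
print(tau_vec, tau_mag)
```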

Feasibility Analysis of Space Harpoons
Coordinate Systems and Dynamic Model of Space Debris
Inertial Parameters Estimation Algorithm
Denoising Model of Redundant IMU Measurement Data
Mass-Center Location Estimation
Estimation of Normalized Angular Momentum
Simulations of Dynamic Parameters’ Identification
Experiments of Dynamic Parameters’ Identification
Findings
Conclusions