Abstract

Conventional spacecraft Guidance, Navigation, and Control (GNC) architectures are designed to receive and execute commands from ground control, with minimal automation and autonomy on board the spacecraft. In contrast, Artificial Intelligence (AI)-based systems allow real-time decision-making that considers system information which is difficult to model and incorporate into the conventional decision-making process involving ground control or human operators. With growing interest in on-orbit servicing involving manipulation, conventional GNC faces numerous challenges in adapting to a wide range of possible scenarios, such as removing unknown debris; these challenges could be addressed with emerging AI-enabled robotic technologies. However, a complete paradigm shift may take years of effort. As an intermediate solution, we introduce a novel visual GNC system with two state-of-the-art AI modules that replace the corresponding functions of a conventional GNC system for on-orbit manipulation. The AI components are: (i) a Deep Learning (DL)-based pose estimation algorithm that estimates a target's pose from two-dimensional images using a pre-trained neural network, without requiring any prior information on the dynamics or state of the target; and (ii) a technique for modeling space robot manipulator trajectories probabilistically and reproducing them in previously unseen situations, which avoids complex on-board trajectory optimization and also minimizes the attitude disturbance induced on the spacecraft by the motion of the robot arm. The architecture uses a centralized camera network as the main sensor, and the trajectory learning module for the 7 degrees of freedom robotic arm is integrated into the GNC system. The intelligent visual GNC system is demonstrated through simulation of a conceptual mission, AISAT, in which a micro-satellite carries out on-orbit manipulation around a non-cooperative CubeSat. The simulation shows how the GNC system operates in discrete time, with the control and trajectory planning generated in Matlab/Simulink. The rendering engine Eevee renders the whole simulation to provide graphical realism for the DL pose estimation. Finally, the testbeds developed to evaluate and demonstrate the GNC system are also introduced. The novel intelligent GNC system can be a stepping stone toward future fully autonomous orbital robot systems.
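
To illustrate the second module, the following is a minimal sketch of probabilistic trajectory modeling and reproduction in the style of movement primitives: a Gaussian distribution over basis-function weights is learned from demonstrated arm trajectories and then conditioned on a new via-point, such as a grasp configuration derived from the estimated target pose. The basis functions, noise values, and variable names are illustrative assumptions, not the implementation used in this work.

```python
# Minimal sketch of probabilistic trajectory modelling and reproduction
# (movement-primitive style). All parameter values are illustrative assumptions.
import numpy as np

def basis(t, n_basis=10, width=0.02):
    """Normalised Gaussian basis functions evaluated at phase t in [0, 1]."""
    centres = np.linspace(0.0, 1.0, n_basis)
    phi = np.exp(-(t - centres) ** 2 / (2.0 * width))
    return phi / phi.sum()

def learn_weight_distribution(demos, n_basis=10):
    """Fit a Gaussian over basis weights from demonstrated joint trajectories."""
    weights = []
    for traj in demos:                                 # traj: (T,) joint positions
        T = len(traj)
        Phi = np.stack([basis(t, n_basis) for t in np.linspace(0, 1, T)])
        w, *_ = np.linalg.lstsq(Phi, traj, rcond=None)
        weights.append(w)
    W = np.stack(weights)
    return W.mean(axis=0), np.cov(W.T) + 1e-6 * np.eye(n_basis)

def condition_on_viapoint(mu_w, Sigma_w, t_star, y_star, sigma_y=1e-4):
    """Condition the weight distribution on reaching y_star at phase t_star,
    adapting the learned motion to a previously unseen target configuration."""
    phi = basis(t_star, len(mu_w))[:, None]            # (n_basis, 1)
    k = Sigma_w @ phi / (sigma_y + phi.T @ Sigma_w @ phi)
    mu_new = mu_w + (k * (y_star - phi.T @ mu_w)).ravel()
    Sigma_new = Sigma_w - k @ phi.T @ Sigma_w
    return mu_new, Sigma_new

# Example: adapt the mean reproduction so one joint reaches 0.9 rad at the end
# of the motion (phase 1.0), using five synthetic demonstrations.
demos = [np.sin(np.linspace(0, np.pi / 2, 100)) + 0.02 * np.random.randn(100)
         for _ in range(5)]
mu_w, Sigma_w = learn_weight_distribution(demos)
mu_w, Sigma_w = condition_on_viapoint(mu_w, Sigma_w, t_star=1.0, y_star=0.9)
reproduction = [basis(t, len(mu_w)) @ mu_w for t in np.linspace(0, 1, 100)]
```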

Highlights

  • With increasing interest in orbital robotic manipulation for on-orbit servicing missions, future Guidance, Navigation, and Control (GNC) systems are moving toward intelligent on-board systems with greater autonomy to address the challenges of time-critical and highly unpredictable scenarios

  • To perform relative pose estimation for rendezvous with a non-cooperative spacecraft, we developed two approaches: the non-keypoint-based approach presented in Proenca and Gao (2020) and the keypoint-based approach presented in Rathinam and Gao (2020)

  • During pose estimation, we can set a higher confidence threshold for object detection and keypoint estimation, so that lower-accuracy estimates are filtered out and only more reliable estimated poses are kept (see the sketch after this list)
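
As a concrete illustration of the keypoint-based approach and the confidence filtering described above, the following is a minimal sketch that keeps only keypoints above a confidence threshold and recovers the relative pose with OpenCV's solvePnP. The network interface, the CubeSat keypoint layout, and the threshold values are illustrative assumptions, not the pipelines of Proenca and Gao (2020) or Rathinam and Gao (2020).

```python
# Minimal sketch of keypoint-based relative pose estimation with confidence
# filtering. The keypoint network and thresholds are hypothetical; only the
# PnP step uses a real OpenCV API (cv2.solvePnP).
import numpy as np
import cv2

# Assumed 3D keypoints of the target CubeSat in its body frame (metres).
CUBESAT_KEYPOINTS_3D = np.array([
    [ 0.05,  0.05,  0.05], [ 0.05,  0.05, -0.05],
    [ 0.05, -0.05,  0.05], [ 0.05, -0.05, -0.05],
    [-0.05,  0.05,  0.05], [-0.05,  0.05, -0.05],
    [-0.05, -0.05,  0.05], [-0.05, -0.05, -0.05],
], dtype=np.float64)

def estimate_pose(image, keypoint_net, camera_matrix,
                  det_threshold=0.8, kp_threshold=0.7):
    """Run a (hypothetical) keypoint network, drop low-confidence detections,
    and recover the relative pose from the surviving 2D-3D correspondences."""
    detection = keypoint_net(image)           # assumed to return a dict
    if detection["score"] < det_threshold:    # reject unreliable object detections
        return None

    kp_2d = np.asarray(detection["keypoints"], dtype=np.float64)   # (8, 2) pixels
    kp_conf = np.asarray(detection["keypoint_scores"])             # (8,)
    keep = kp_conf >= kp_threshold            # filter out low-accuracy keypoints

    if keep.sum() < 4:                        # PnP needs at least 4 correspondences
        return None

    ok, rvec, tvec = cv2.solvePnP(
        CUBESAT_KEYPOINTS_3D[keep], kp_2d[keep],
        camera_matrix, distCoeffs=None, flags=cv2.SOLVEPNP_EPNP)
    return (rvec, tvec) if ok else None
```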



INTRODUCTION

In contrast to conventional GNC systems, where control is usually managed by different subsystems with limited data exchange and decision-making, the intelligent GNC system can exploit all sensor information and use pre-trained neural networks to deliver optimized control results. Such a system demands standardized communication networks across all sensors and a more powerful logic unit for handling raw sensor data and computing globally optimized control schemes that simultaneously manage orbital control, spacecraft attitude control, and robotic arm control. From this comparison, the major technical challenges for the intelligent GNC architecture are identified as follows: (1) low-computational-power hardware in the space radiation environment and a limited energy supply; (2) the robustness of pre-trained neural networks for real space missions.
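
For illustration, one cycle of such a centralized loop could be organized as sketched below, where a shared camera network feeds a pre-trained pose estimator whose output conditions both the learned arm trajectory and the attitude controller. All module names are hypothetical placeholders, not the paper's implementation.

```python
# Hypothetical single cycle of a centralised intelligent GNC loop; every module
# name below is an illustrative placeholder.
def gnc_cycle(camera_network, pose_estimator, trajectory_model,
              attitude_ctrl, arm_ctrl):
    images = camera_network.capture()                    # centralised camera network
    target_pose = pose_estimator(images)                 # DL-based relative pose estimate
    if target_pose is None:
        return None                                      # hold last safe command if the estimate is rejected
    arm_traj = trajectory_model.reproduce(target_pose)   # probabilistic reproduction for the 7-DOF arm
    arm_cmd = arm_ctrl.track(arm_traj)
    att_cmd = attitude_ctrl.compensate(arm_cmd)          # counteract arm-induced attitude disturbance
    return arm_cmd, att_cmd
```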
