Abstract

Three-dimensional attitude estimation for unmanned aerial vehicles (UAVs) is usually based on the combination of magnetometer, accelerometer, and gyroscope (MARG). However, MARG sensors are easily affected by various disturbances, such as vibration, external magnetic interference, and gyro drift. An optical flow sensor can extract motion information from an image sequence and therefore has the potential to augment three-dimensional attitude estimation for UAVs. The major problem is that optical flow can be caused by both translational and rotational movements, which are difficult to distinguish from each other. To address these problems, this article uses a gated recurrent unit (GRU) neural network to fuse data from MARG and optical flow sensors, thereby enhancing the accuracy of three-dimensional attitude estimation for UAVs. The proposed algorithm effectively exploits the attitude information contained in the optical flow measurements and achieves multi-sensor fusion for attitude estimation without an explicit mathematical model. Compared with the commonly used extended Kalman filter (EKF) algorithm for attitude estimation, the proposed algorithm shows higher accuracy in flight tests on a quad-rotor UAV.
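To make the fusion idea concrete, below is a minimal sketch (not the authors' implementation) of a GRU network that takes a window of MARG measurements (9 channels: 3-axis magnetometer, accelerometer, gyroscope) concatenated with 2-D optical flow measurements and regresses roll, pitch, and yaw. The 11-dimensional input layout, layer sizes, and output parameterization are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class GRUAttitudeEstimator(nn.Module):
    """Hypothetical GRU-based MARG + optical flow fusion network."""
    def __init__(self, input_dim=11, hidden_dim=64, num_layers=2, output_dim=3):
        super().__init__()
        # GRU processes the sensor time series; batch_first expects (batch, time, features)
        self.gru = nn.GRU(input_dim, hidden_dim, num_layers, batch_first=True)
        # Fully connected layer maps the final hidden state to roll, pitch, yaw
        self.fc = nn.Linear(hidden_dim, output_dim)

    def forward(self, x):
        out, _ = self.gru(x)            # out: (batch, time, hidden_dim)
        return self.fc(out[:, -1, :])   # attitude estimate at the last time step

# Example: a batch of 8 windows, each 50 samples long
model = GRUAttitudeEstimator()
marg_and_flow = torch.randn(8, 50, 11)   # 9 MARG channels + 2 optical-flow channels
attitude = model(marg_and_flow)          # shape (8, 3): roll, pitch, yaw estimates
```

Such a network would be trained by supervised regression against reference attitude (e.g., from a motion-capture system or a high-grade AHRS), which is how it avoids an explicit sensor fusion model.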

Highlights

  • Attitude estimation is a prerequisite for stable and controllable flight of unmanned aerial vehicles (UAVs).[1,2,3] In recent years, attitude and heading reference systems (AHRS) have been widely adopted in UAV flight control.[4]

  • With the continuous development of micro-electro-mechanical system (MEMS) technology, MARG-based AHRS provides a low-cost solution for UAV attitude estimation.

  • It is worth mentioning that the proposed gated recurrent unit (GRU)-based algorithm is evaluated in two cases, working with and without the optical flow sensor, whereas the extended Kalman filter (EKF) performs typical MARG-based attitude estimation and does not make use of the optical flow sensor.


Summary

Introduction

Attitude estimation is a prerequisite for stable and controllable flight of unmanned aerial vehicles (UAVs).[1,2,3] In recent years, attitude and heading reference systems (AHRS) have been widely adopted in UAV flight control.[4] A GRU neural network can effectively utilize the measurements of MARG and optical flow sensors for attitude estimation without the need for rigorous mathematical modeling, making it a feasible approach for data fusion. Compared to attitude algorithms based on the Elman neural network,[31,32] the GRU-based attitude estimation method proposed in this article shows higher accuracy. It is worth mentioning that the proposed GRU-based algorithm is evaluated in two cases, working with and without the optical flow sensor, whereas the EKF performs typical MARG-based attitude estimation and does not make use of the optical flow sensor. The GRU-based algorithm also requires a shorter execution time than the EKF, either with or without the aid of optical flow.
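The two evaluation cases can be expressed as the same architecture run on different input widths. The sketch below is illustrative only and assumes the hypothetical GRUAttitudeEstimator class from the earlier sketch is in scope; the channel counts and window length are assumptions, not values from the paper.

```python
import time
import torch

marg_only = GRUAttitudeEstimator(input_dim=9)    # case 1: MARG only, no optical flow
marg_flow = GRUAttitudeEstimator(input_dim=11)   # case 2: MARG + 2-D optical flow

window = torch.randn(1, 50, 11)                  # one 50-sample sensor window
with torch.no_grad():
    t0 = time.perf_counter()
    est_marg = marg_only(window[..., :9])        # drop the two optical-flow channels
    est_full = marg_flow(window)                 # use all 11 channels
    print(f"inference time for both cases: {time.perf_counter() - t0:.4f} s")
```

Timing a forward pass in this way is only a rough proxy for the execution-time comparison reported against the EKF, which depends on the target hardware and implementation.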

