Abstract

To address the problem that the external systematic errors of an optical camera cannot be fully estimated under limited computing resources, a unified dimensionality-reduction representation of these errors is proposed, enabling autonomous relative optical navigation. The camera translational and misalignment errors are converted into a single three-dimensional rotation error, whose differential model can be established through specific attitude control and appropriate assumptions. The rotation error and the relative motion state are then jointly estimated in an augmented Kalman filter framework. Compared with the traditional method that estimates the camera translational and misalignment errors separately, the proposed method reduces computational complexity because the dimension of the estimated state is lower. Furthermore, numerical simulation demonstrates that the estimation accuracy is improved significantly.
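The abstract's core idea, augmenting the relative motion state with a reduced three-dimensional error term and estimating both in one Kalman filter, can be sketched as follows. This is an illustrative assumption-laden toy, not the paper's actual models: the dynamics matrix, measurement matrix, and noise levels below are placeholders, and the rotation-error coupling is a fixed linearised stand-in (in the paper, observability of the error would come from the specific attitude maneuvers, which are not modelled here).

```python
import numpy as np

def make_filter(dt=1.0):
    """Build hypothetical augmented-filter matrices.

    State (9-D): 3-D relative position, 3-D relative velocity,
    and the 3-D rotation error appended as extra states.
    """
    n = 9
    F = np.eye(n)
    F[0:3, 3:6] = dt * np.eye(3)   # constant-velocity relative motion (assumed)
    # Rotation error modelled as a slowly varying random walk (identity block in F).
    H = np.zeros((3, n))
    H[:, 0:3] = np.eye(3)          # camera observes relative position ...
    H[:, 6:9] = np.eye(3)          # ... perturbed by the rotation error (placeholder coupling)
    Q = 1e-4 * np.eye(n)           # assumed process noise
    R = 1e-2 * np.eye(3)           # assumed measurement noise
    return F, H, Q, R

def kf_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the augmented Kalman filter."""
    # Predict: propagate state and covariance.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: fuse the camera measurement z.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

The computational saving claimed in the abstract follows directly from this structure: replacing a 6-D translational-plus-misalignment error augmentation with a 3-D rotation error shrinks the augmented state, and the filter's covariance operations scale cubically with state dimension.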
