Abstract

Unmanned Aircraft Systems (UAS) can benefit significantly from vision-based control when conventional sources of accurate position/orientation (pose) data (e.g., Global Positioning System/Inertial Measurement Unit) are not available. Here, a new paradigm for visual servoing is presented that is fundamentally different from the conventional paradigms: the Position-Based Visual Servoing (PBVS) and Image-Based Visual Servoing (IBVS) approaches. With the new paradigm, measurement of the pose state variables of the UAS is not necessary; the image features are fed back directly to control the UAS's motion. These image features, however, are directly related to the error between the pose of the UAS and a user-defined desired pose. The pose error of the UAS in Euclidean space is calculated in real time, and a closed-loop model-based control law, which accounts for the UAS dynamics, uses these errors to control the UAS. Thus, unlike IBVS methods, the presented approach does not generate undesirable motions of the vehicle in Euclidean space. Moreover, the proposed method does not require numerical calculations, so it is not as computationally expensive as PBVS methods. Using the proposed paradigm, a UAS can be controlled in real time to follow any user-defined desired trajectory with respect to a fixed target using only visual feedback. The approach is formulated in a general form for robotic vehicles with six Degrees of Freedom (DOFs) and then, as an example, simulated for a quadrotor UAS.
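The closed-loop idea described above can be illustrated with a minimal sketch. This is not the paper's controller: the vehicle model here is a simple double integrator per axis, the pose error is assumed to be available directly (in the paper it would be derived from image features), and the PD gains `kp`/`kd` are illustrative assumptions.

```python
def simulate(desired_pose, steps=2000, dt=0.01, kp=4.0, kd=3.0):
    """Drive the pose error to zero with a PD law on each axis.

    desired_pose: list of target pose coordinates (hypothetical units).
    Returns the final pose after `steps` integration steps of length `dt`.
    """
    pose = [0.0] * len(desired_pose)   # current pose, starting at the origin
    vel = [0.0] * len(desired_pose)    # current velocity
    for _ in range(steps):
        for i in range(len(pose)):
            error = desired_pose[i] - pose[i]   # pose error in Euclidean space
            accel = kp * error - kd * vel[i]    # model-based PD control law
            vel[i] += accel * dt                # double-integrator dynamics
            pose[i] += vel[i] * dt
    return pose

final = simulate([1.0, -2.0, 3.0])
print([round(x, 3) for x in final])  # converges to the desired pose
```

With these gains the closed loop is well damped, so after the simulated 20 s the pose has settled on the user-defined target along every axis.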
