Abstract

Vision-based control of Unmanned Aerial Vehicles (UAVs) is gaining global interest. Rapid advances in fast image acquisition and processing tools are making vision sensors omnipresent. Information obtained from the on-board imaging sensor of a UAV can be used for mapping the environment, localizing the UAV, visual odometry, and tracking pre-specified trajectories and/or waypoints. The information obtained through vision sensors can either be fused with GPS measurements or be used in GPS-deprived scenarios such as indoor applications. A vision-based control strategy based on an adaptive prediction, planning, and execution framework is proposed with the objective of smoothly servo-tracking an object in near-optimal time. At the higher level, a class of C2-continuous, quintic-polynomial-based trajectories is planned, taking the maximum permissible acceleration of the flyer into account. At the lower level, a Linear Quadratic Regulator (LQR) is used to track the planned trajectory. Replanning is carried out under two conditions: (i) when the flyer fails to track the planned trajectory closely, or (ii) when the target object starts moving. The framework was tested in simulation on a 2-degree-of-freedom model helicopter equipped with an on-board pinhole perspective camera.
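
The sketch below is a minimal, illustrative Python example of the higher-level planning idea described in the abstract: a quintic polynomial segment satisfying C2 boundary conditions (position, velocity, and acceleration at both endpoints), with the segment duration stretched until a sampled peak acceleration respects an assumed limit. The function names, the rest-to-rest simplification, and the duration-growth heuristic are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch only; not the paper's code.
    import numpy as np

    def plan_quintic(p0, v0, a0, pf, vf, af, T):
        """Coefficients c of p(t) = sum_k c_k t^k (k = 0..5) matching
        position, velocity, and acceleration at t = 0 and t = T."""
        A = np.array([
            [1, 0, 0,    0,      0,       0],
            [0, 1, 0,    0,      0,       0],
            [0, 0, 2,    0,      0,       0],
            [1, T, T**2, T**3,   T**4,    T**5],
            [0, 1, 2*T,  3*T**2, 4*T**3,  5*T**4],
            [0, 0, 2,    6*T,    12*T**2, 20*T**3],
        ], dtype=float)
        b = np.array([p0, v0, a0, pf, vf, af], dtype=float)
        return np.linalg.solve(A, b)

    def plan_within_accel_limit(p0, pf, a_max, T0=1.0, growth=1.2):
        """Grow the duration until the sampled peak |acceleration| is
        within a_max (rest-to-rest case, for brevity)."""
        T = T0
        while True:
            c = plan_quintic(p0, 0.0, 0.0, pf, 0.0, 0.0, T)
            t = np.linspace(0.0, T, 200)
            acc = 2*c[2] + 6*c[3]*t + 12*c[4]*t**2 + 20*c[5]*t**3
            if np.max(np.abs(acc)) <= a_max:
                return c, T
            T *= growth

    coeffs, T = plan_within_accel_limit(p0=0.0, pf=2.0, a_max=1.0)
    print(f"duration {T:.2f} s, coefficients {np.round(coeffs, 4)}")

In the abstract's framework, a segment like this would be regenerated whenever the lower-level tracker (an LQR in the paper) deviates too far from the plan or the target moves, which is what the replanning conditions (i) and (ii) describe.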
