Abstract
Realizing a visual flying environment for various scenarios is key to establishing visual hardware-in-the-loop (HIL) simulation for flying UAVs. A new approach is presented to simulate a desired UAV flying mission that lands on a specific building at arbitrary geodetic coordinates. The approach fulfills the UAV landing HIL via three key elements: Building, Training, and Servoing. Building a dynamic flying scene that is not limited to a certain scenario but can generate different aerial remote sensing scenes according to the desired geodetic coordinates (location-free), flying mission, perspective angles, and approach velocities; the dynamic scene is constructed with the RHINO graphics software from georeferenced images taken at different altitudes. Training a deep detection framework on the dataset extracted from each constructed scene to localize a specific building as a landing platform. Servoing a pan–tilt camera on an embedded system with a pure image-based visual servoing (IBVS) method so that the detected building center coincides with the center of the acquired frame throughout the flight. Tiny-YOLOv4 is chosen as a lightweight detector; it runs on an Nvidia Jetson Nano with a servo kit and achieves satisfactory results in real time even under two wind-disturbance models. Thus, a successful UAV landing on a specific building is verified.
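The servoing element described above can be illustrated with a minimal sketch. The snippet below is not the paper's controller: the frame size, focal length, gain, and the toy pinhole projection are all assumptions, and the loop only shows the core IBVS idea of proportionally steering pan/tilt angles until the detected center coincides with the frame center.

```python
import math

FRAME_W, FRAME_H = 640, 480   # assumed frame size in pixels
F_PX = 500.0                  # assumed focal length in pixels
GAIN = 0.5                    # proportional IBVS gain (assumed)

def ibvs_step(u, v, pan, tilt):
    """One IBVS iteration: drive the detected building center (u, v)
    toward the frame center by updating pan/tilt angles (radians)."""
    ex = u - FRAME_W / 2      # horizontal pixel error
    ey = v - FRAME_H / 2      # vertical pixel error
    # Small-angle pinhole approximation: pixel error / focal ~ angular error.
    pan += GAIN * ex / F_PX
    tilt += GAIN * ey / F_PX
    return pan, tilt

def project(theta_t, phi_t, pan, tilt):
    """Toy pinhole projection of a target at bearing (theta_t, phi_t)
    as seen by a camera pointed at (pan, tilt)."""
    u = FRAME_W / 2 + F_PX * math.tan(theta_t - pan)
    v = FRAME_H / 2 + F_PX * math.tan(phi_t - tilt)
    return u, v

# Simulated run: target at a fixed bearing, camera initially misaligned.
theta_t, phi_t = 0.2, -0.1
pan, tilt = 0.0, 0.0
for _ in range(50):
    u, v = project(theta_t, phi_t, pan, tilt)
    pan, tilt = ibvs_step(u, v, pan, tilt)
```

After the loop the pan/tilt angles converge to the target bearing, so the detected center sits at the frame center; a real deployment would replace `project` with the Tiny-YOLOv4 detection on each camera frame.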