Perceiving the horizon line provides a critical alternative for unmanned aerial vehicle (UAV) autonomous navigation, especially under noise-induced drift, unavailability of satellite navigation, and multipath errors. However, horizon line detection is challenging because of the large volume of remotely sensed data, the dynamic changes during flight, and the serious consequences of failure. To address these problems, we propose a graph-based horizon line detection technique that consists of graph-based image segmentation, connected domain cascade filtering, horizon line extraction, and UAV attitude estimation. We improve the graph-based image segmentation algorithm so that its results are better suited to horizon line detection. We then identify the sky component by cascade filtering and extract the horizon line from the boundaries of that component. Furthermore, we compute the roll and pitch angles directly from the extracted horizon line and resolve the angle ambiguity. To evaluate our approach qualitatively and quantitatively, we designed a fixed-wing UAV system and validated the algorithm through extensive flights under various conditions, comparing the estimated roll and pitch angles with those from the IMU. Statistical results show that the proposed technique provides unbiased attitude angles with an error variance of about 2, verifying the validity and robustness of our method. In terms of engineering, our optimized implementation runs at approximately 60 frames per second in flight.
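The abstract states that roll and pitch are computed directly from the extracted horizon line. As a rough illustration of that final step only (not the authors' implementation), the sketch below assumes a pinhole camera with known focal length and principal point; the function name `attitude_from_horizon` and the sign conventions are assumptions, and the paper's ambiguity-resolution step is not reproduced here.

```python
import numpy as np

def attitude_from_horizon(p1, p2, principal_point, focal_px):
    """Estimate roll and pitch (radians) from a horizon line given by two
    image points p1, p2 in pixel coordinates (x right, y down).

    Hypothetical sketch: assumes a pinhole camera aligned with the UAV body
    axes; the paper's exact formulation may differ.
    """
    (x1, y1), (x2, y2) = p1, p2
    cx, cy = principal_point

    # Roll: inclination of the horizon line in the image plane.
    roll = np.arctan2(y2 - y1, x2 - x1)

    # Pitch: signed perpendicular distance from the principal point to the
    # horizon line, converted to an angle via the focal length.
    # Line through p1, p2 in implicit form a*x + b*y + c = 0.
    a, b = y2 - y1, x1 - x2
    c = -(a * x1 + b * y1)
    d = (a * cx + b * cy + c) / np.hypot(a, b)
    pitch = np.arctan2(d, focal_px)
    return roll, pitch

# Example: a horizon tilted by about 5.7 degrees that crosses below the
# image centre of a 640x480 frame with an 800 px focal length.
roll, pitch = attitude_from_horizon((0, 260), (640, 324), (320, 240), 800.0)
print(np.degrees(roll), np.degrees(pitch))
```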