Abstract

The performance of optical flow estimation is affected by many factors. In particular, the impact of illumination changes and video quality degradation in unmanned aerial vehicle (UAV) videos on optical flow estimation cannot be ignored. Inspired by the visual adaptation mechanism of the human retina, we propose an illumination adjustment mechanism that imitates retinal processing to reduce illumination variation. We further introduce an edge refinement mechanism into optical flow estimation based on weighted neighborhood filtering. Experimental results on public benchmarks including KITTI 2012, KITTI 2015, MPI Sintel, and Middlebury show that the proposed approach is robust to illumination changes and preserves accurate motion details. Further, experiments on outdoor livestock UAV videos show that the approach remains robust to illumination variation and accurately detects motion edges in this type of video as well. The performance on public benchmarks and livestock UAV videos demonstrates that the proposed approach improves the motion edge accuracy of optical flow fields under varying illumination.
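To make the two mechanisms named above concrete, the sketch below illustrates, under stated assumptions, what a retina-style illumination normalization and a weighted neighborhood refinement of a flow field could look like. The function names, the Naka-Rushton-like compression, the bilateral-style weights, and all parameter values are illustrative choices, not the paper's actual formulation.

```python
import numpy as np

def retina_like_adaptation(image, eps=1e-6):
    """Illustrative retina-style illumination normalization (assumption).

    Each pixel is compressed by a local adaptation level (Naka-Rushton-like),
    which suppresses global illumination changes while keeping local contrast.
    """
    image = image.astype(np.float64)
    h, w = image.shape
    pad = np.pad(image, 2, mode="reflect")
    # Local adaptation level: mean intensity over a 5x5 neighborhood.
    local_mean = np.zeros_like(image)
    for dy in range(5):
        for dx in range(5):
            local_mean += pad[dy:dy + h, dx:dx + w]
    local_mean /= 25.0
    return image / (image + local_mean + eps)

def weighted_neighborhood_refinement(flow, image, sigma_s=2.0, sigma_r=10.0, radius=3):
    """Illustrative edge-preserving flow refinement (assumption).

    Each flow vector is replaced by a weighted average of its neighbors, with
    weights combining spatial distance and image-intensity similarity
    (bilateral-style), so averaging does not blur across motion edges that
    coincide with image edges.
    """
    h, w, _ = flow.shape
    img = image.astype(np.float64)
    refined = np.zeros_like(flow)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            yy, xx = np.mgrid[y0:y1, x0:x1]
            spatial = ((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2)
            intensity = (img[y0:y1, x0:x1] - img[y, x]) ** 2 / (2 * sigma_r ** 2)
            weights = np.exp(-(spatial + intensity))
            weights /= weights.sum()
            refined[y, x] = (weights[..., None] * flow[y0:y1, x0:x1]).sum(axis=(0, 1))
    return refined
```

In this sketch, the adapted images would feed the flow estimator to reduce sensitivity to illumination change, and the refinement step would post-process the estimated flow to sharpen motion boundaries; how the paper actually integrates these steps into its pipeline is not specified in the abstract.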
