Abstract

Conventional surveillance devices are deployed at fixed locations on roadsides, poles, or traffic lights, and therefore provide a constant, fixed view of urban traffic. Over the last two decades, unmanned aerial vehicles (UAVs) have received considerable attention as low-cost, highly flexible platforms that can extend smart-city monitoring over a wider coverage area. Unlike fixed monitoring devices, the camera platform of an aerial vehicle is subject to many constraints: it is in constant motion, including tilting and panning, which makes the data difficult to process for real-time applications. The resulting inaccuracy of object detection in UAV videos has motivated the research community to combine different approaches, such as optical flow and supervised learning algorithms. The method proposed in this research comprises the Lucas-Kanade optical flow method for moving object detection, construction of connected graphs to isolate individual objects, and a convolutional neural network (CNN) followed by a support vector machine (SVM) for final classification. Because the camera platform moves, the generated optical flow also flags background and very small objects as vehicles; the classifier introduced here rules out such non-vehicle moving objects. The methodology is evaluated on both a stationary aerial video and a video from a moving aerial platform, achieving a classification accuracy of 100% on the stationary video and 98% on the video from the moving platform.
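
The pipeline described above (sparse optical flow, grouping of moving points into connected regions, CNN feature extraction, and an SVM for the final vehicle/non-vehicle decision) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it uses OpenCV's goodFeaturesToTrack and calcOpticalFlowPyrLK for the Lucas-Kanade step, connectedComponentsWithStats in place of the paper's connected-graph construction, a pretrained torchvision ResNet-18 as a stand-in for the unspecified CNN, and scikit-learn's SVC as the classifier; the motion and size thresholds are arbitrary placeholders.

import cv2
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import SVC

# CNN feature extractor: a pretrained ResNet-18 backbone is assumed here as a
# generic stand-in for the CNN used in the paper (architecture unspecified).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()          # keep the 512-d pooled features
backbone.eval()
preprocess = T.Compose([
    T.ToPILImage(), T.Resize((224, 224)), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def cnn_features(patch_bgr):
    # Embed a candidate image patch with the CNN backbone.
    rgb = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        return backbone(preprocess(rgb).unsqueeze(0)).squeeze(0).numpy()

def moving_object_patches(prev_gray, gray, frame, flow_thresh=2.0):
    # Lucas-Kanade flow -> motion mask -> connected regions -> candidate patches.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=1000,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return []
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good_old = pts[status.flatten() == 1].reshape(-1, 2)
    good_new = nxt[status.flatten() == 1].reshape(-1, 2)
    motion = np.linalg.norm(good_new - good_old, axis=1)

    # Mark points whose displacement exceeds the (assumed) motion threshold.
    mask = np.zeros(gray.shape, dtype=np.uint8)
    for (x, y), m in zip(good_new, motion):
        if m > flow_thresh:
            cv2.circle(mask, (int(x), int(y)), 5, 255, -1)

    # Group nearby moving points into connected regions as candidate vehicles.
    mask = cv2.dilate(mask, np.ones((15, 15), np.uint8))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    patches = []
    for i in range(1, n):                   # label 0 is the background
        x, y, w, h, area = stats[i]
        if area > 100:                      # assumed minimum-size filter
            patches.append(frame[y:y + h, x:x + w])
    return patches

# Final classifier: an SVM over CNN features (vehicle vs. non-vehicle).
# Training data (labelled patches) is assumed and not shown here.
svm = SVC(kernel="rbf")
# svm.fit(np.stack([cnn_features(p) for p in labelled_patches]), labels)
# is_vehicle = svm.predict([cnn_features(candidate_patch)])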
