Abstract

Traffic estimation from very-high-resolution remote-sensing imagery has received increasing interest during the last few years. In this article, we propose an automatic system for estimating the annual average daily traffic (AADT) using very-high-resolution optical remote-sensing imagery of urban areas in combination with high-quality, but spatially very limited, ground-based measurements. The core of the system is vehicle detection, based on the deep-learning object-detection architecture mask region-based convolutional neural network (Mask R-CNN). We modify it with an image normalization strategy to make it more robust to test images acquired under varying conditions, and we use a precise road mask to assist in filtering driving vehicles from parked ones. Furthermore, to incorporate the high-quality ground-based measurements and to make the traffic estimates more consistent across neighbouring road links, we propose a graph smoothing strategy that utilizes the road network. The fully automatic processing chain was validated on a set of aerial images covering the city of Narvik, Norway. The precision and recall of detecting driving vehicles were 0.74 and 0.66, respectively, and the AADT was estimated with a root mean squared error (RMSE) of 2279 and a bias of −383. We conclude that separating driving vehicles from parked ones can be challenging when vehicles are parked along the roads, and that in urban environments with short road links, several remote-sensing images covering each road link at different time instances are necessary for the imagery to provide a benefit.
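
The abstract does not give the exact rule used to separate driving vehicles from parked ones, only that a precise road mask assists the filtering. The following is a minimal sketch of that idea, assuming the road mask is a binary raster aligned with the image and that a detection is kept when its bounding-box centre falls on a road pixel; the function and variable names are hypothetical, not the authors' implementation.

```python
# Hypothetical sketch: keep only detections whose centres fall on the road mask,
# as a crude proxy for separating driving vehicles from parked ones.
import numpy as np

def filter_driving_vehicles(boxes, road_mask):
    """boxes: (N, 4) array of [x1, y1, x2, y2] pixel coordinates.
    road_mask: 2-D boolean array, True where the pixel belongs to a driving lane.
    Returns the subset of boxes whose centre pixel lies on the road mask."""
    keep = []
    for x1, y1, x2, y2 in boxes:
        cx = int(round((x1 + x2) / 2.0))   # column index of the box centre
        cy = int(round((y1 + y2) / 2.0))   # row index of the box centre
        inside = 0 <= cy < road_mask.shape[0] and 0 <= cx < road_mask.shape[1]
        if inside and road_mask[cy, cx]:
            keep.append([x1, y1, x2, y2])
    return np.asarray(keep).reshape(-1, 4)

# Toy usage: a 100x100 mask with a horizontal road band and two detections.
mask = np.zeros((100, 100), dtype=bool)
mask[40:60, :] = True                       # road pixels
dets = np.array([[10, 45, 20, 55],          # centre on the road  -> kept
                 [10, 70, 20, 80]])         # centre off the road -> dropped
print(filter_driving_vehicles(dets, mask))
```

In practice the mask would likely be buffered or restricted to driving lanes so that roadside parking is excluded, which the abstract identifies as the difficult case.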
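
The abstract also does not specify the graph smoothing formulation. The sketch below assumes a simple iterative neighbourhood averaging over the road-link graph in which links with ground-based counts are held fixed, so the reliable measurements propagate to neighbouring links. All names, the weight `alpha`, and the stopping rule are illustrative assumptions rather than the authors' method.

```python
# Hypothetical sketch of graph smoothing over the road network: each link's AADT
# is repeatedly blended with the average of its neighbours, while links that have
# ground-based counts are anchored to their measured values.
import numpy as np

def smooth_aadt(neighbours, initial, ground_truth, alpha=0.5, n_iter=100):
    """neighbours: dict mapping link id -> list of adjacent link ids.
    initial: dict of image-based AADT estimates per link.
    ground_truth: dict of ground-measured AADT for the few instrumented links.
    alpha: weight given to the neighbourhood average at each iteration."""
    aadt = dict(initial)
    aadt.update(ground_truth)                       # trust the ground stations
    for _ in range(n_iter):
        new = {}
        for link, value in aadt.items():
            if link in ground_truth:                # anchored: never changed
                new[link] = ground_truth[link]
                continue
            nbrs = [aadt[n] for n in neighbours.get(link, []) if n in aadt]
            new[link] = (1 - alpha) * value + alpha * np.mean(nbrs) if nbrs else value
        aadt = new
    return aadt

# Toy road network: a chain of four links, one of them instrumented.
nbrs = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
est = {"a": 5000, "b": 9000, "c": 2000, "d": 4000}   # noisy image-based estimates
gt = {"b": 6000}                                      # ground-based count on link b
print(smooth_aadt(nbrs, est, gt))
```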
