Abstract
Roadside noise barriers (RNBs) are important pieces of urban infrastructure for developing liveable cities. However, the absence of accurate, large-scale geospatial data on RNBs has impeded progress toward rational urban planning, sustainable cities, and healthy environments. To address this problem, this study proposes a geospatial artificial intelligence framework to create a vectorized RNB dataset for China using street view imagery. First, intensive sampling is performed on the road network of each city based on OpenStreetMap, which serves as the geo-reference for downloading 5.6 million Baidu Street View (BSV) images. Then, exploiting the prior geographic knowledge contained in street view images, convolutional neural networks incorporating image context information (IC-CNNs), combined through an ensemble learning strategy, are developed to detect RNBs in the BSV images. Subsequently, an RNB dataset represented as polylines is generated from the identified RNB locations, with a total length of 2,227 km across 215 cities. Finally, the quality of the RNB dataset is evaluated from two perspectives: detection accuracy, and completeness and positional accuracy. Specifically, on a randomly selected sample of 10,000 BSV images, four quantitative metrics are calculated: an overall accuracy of 98.61 %, recall of 87.14 %, precision of 76.44 %, and F1-score of 81.44 %. Moreover, a total of 254 km of roads in different cities is manually surveyed using BSV images to evaluate the mileage deviation and overlap level between the generated and surveyed RNBs. The root-mean-squared error for mileage deviation is 0.08 km, and the intersection over union for overlap level is 88.08 % ± 2.95 %. The evaluation results suggest that the generated RNB dataset is of high quality and can serve as an accurate and reliable dataset for a variety of large-scale urban studies.
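The four detection metrics reported above follow the standard confusion-matrix definitions, and the mileage overlap is a length-based intersection over union. The sketch below is illustrative only (it is not the authors' code, and the counts in the usage comment are hypothetical); it shows how such metrics are derived from true/false positive and negative counts and from intersected/united road lengths.

```python
def detection_metrics(tp: int, fp: int, fn: int, tn: int):
    """Standard classification metrics from confusion-matrix counts.

    tp/fp/fn/tn: true positives, false positives, false negatives,
    true negatives from comparing detections against labelled images.
    """
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1


def length_iou(intersection_km: float, union_km: float) -> float:
    """Length-based intersection over union between generated and
    manually surveyed barrier polylines (both in kilometres)."""
    return intersection_km / union_km


# Hypothetical counts, for illustration only:
acc, prec, rec, f1 = detection_metrics(tp=3, fp=1, fn=1, tn=5)
```

Note that the reported numbers are internally consistent: 2 × 0.7644 × 0.8714 / (0.7644 + 0.8714) ≈ 0.8144, matching the stated F1-score of 81.44 %.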
The generated vectorized RNB dataset and the labelled BSV image benchmark dataset are publicly available at https://doi.org/10.11888/Others.tpdc.271914 (Chen, 2021).