Abstract

Accurate identification of urban land-use patterns is essential for the rational optimization of urban structure. Land use can be better classified by combining the external physical characteristics of city parcels obtained from remote sensing images with the socioeconomic attributes revealed by social sensing data. However, most existing social sensing data suffer from location bias and poor temporal resolution, so they cannot accurately reflect the socioeconomic characteristics of land use, which leads to low classification accuracy. To address these problems, this study mines the deep semantic information of time-series electricity data with high spatial and temporal resolution, examines its relationship with socioeconomic attributes, and constructs a neural network (TR-CNN) that fuses time-series electricity data and remote sensing images to identify urban land-use types. We selected Anyuan District in Pingxiang City, Jiangxi Province, for a demonstration study. The results show that the accuracy of the proposed model is 0.934, which is 4.3% and 6.7% higher than those of the ResNet18 model using only remote sensing images and the LSTM-FCN model using only time-series electricity data, respectively. The results also show that time-series electricity data can effectively identify residential and commercial areas but, compared with remote sensing images, have difficulty identifying public service facilities. This study shows for the first time that the semantic features of electricity data can fully reflect socioeconomic attributes and that, by coupling remote sensing images with electricity data, urban land-use patterns can be accurately perceived through both "top-down" and "bottom-up" recognition.
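
The abstract does not detail the TR-CNN architecture; the following is a minimal, hypothetical sketch of the fusion idea it describes: a ResNet18-style branch for remote sensing imagery and an LSTM-FCN-style branch for time-series electricity data, with the two embeddings concatenated before a land-use classifier. Layer sizes, branch structure, and the number of land-use classes are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical late-fusion sketch of an image + time-series land-use classifier.
# Branch design, layer sizes, and num_classes are illustrative assumptions;
# they do not reproduce the paper's TR-CNN.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class ImageTimeSeriesFusion(nn.Module):
    def __init__(self, num_classes: int = 5, ts_channels: int = 1):
        super().__init__()
        # Image branch: ResNet18 backbone with its classification head removed,
        # yielding a 512-d embedding of the remote sensing patch.
        backbone = resnet18(weights=None)
        backbone.fc = nn.Identity()
        self.image_branch = backbone

        # Time-series branch in the LSTM-FCN style: an LSTM over the electricity
        # series plus a 1-D convolutional stack with global average pooling.
        self.lstm = nn.LSTM(ts_channels, 128, batch_first=True)
        self.fcn = nn.Sequential(
            nn.Conv1d(ts_channels, 128, kernel_size=8, padding="same"),
            nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 256, kernel_size=5, padding="same"),
            nn.BatchNorm1d(256), nn.ReLU(),
            nn.Conv1d(256, 128, kernel_size=3, padding="same"),
            nn.BatchNorm1d(128), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )

        # Fusion head: concatenate image and time-series embeddings, then classify.
        self.classifier = nn.Linear(512 + 128 + 128, num_classes)

    def forward(self, image: torch.Tensor, series: torch.Tensor) -> torch.Tensor:
        # image: (B, 3, H, W); series: (B, T, ts_channels)
        img_feat = self.image_branch(image)                        # (B, 512)
        _, (h_n, _) = self.lstm(series)                            # (1, B, 128)
        lstm_feat = h_n[-1]                                        # (B, 128)
        conv_feat = self.fcn(series.transpose(1, 2)).squeeze(-1)   # (B, 128)
        fused = torch.cat([img_feat, lstm_feat, conv_feat], dim=1)
        return self.classifier(fused)                              # class logits
```

In this sketch the two modalities are fused late (at the embedding level), which keeps each branch interchangeable with the single-modality baselines mentioned in the abstract (ResNet18 for imagery only, LSTM-FCN for electricity data only).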
