Abstract

Urban poverty is a major obstacle to healthy urbanization, and identifying and mapping it is of great significance to sustainable urban development. Traditional data and methods cannot measure urban poverty at a fine scale. Moreover, existing studies often ignore the impact of the built environment and fail to treat poverty indicators as equally important. Emerging multi-source big data provide new opportunities for accurately measuring and monitoring urban poverty. This study aims to map urban poverty space at a fine scale using multi-source big data, including social sensing and remote sensing data. The urban core of Zhengzhou is selected as the study area. The characteristics of a community's living environment are quantified by accessibility, block vitality, per-unit rent, public service infrastructure, and socio-economic factors. An urban poverty spatial index (SI) model is constructed as a multiplier index of these factors, and self-organizing map (SOM) clustering is employed to identify urban poverty space based on the resulting SI. The performance of the proposed SI model is evaluated at the neighborhood scale. The results show that this multi-source big data approach captures the spatial patterns of typical urban poverty with relatively high accuracy. Compared with urban poverty space measured from remote sensing data alone, the method incorporates built-environment and socio-economic factors into the identification of inner-city poverty space, and avoids being misled by the surface texture of residential areas and the external structure of buildings. Overall, this study provides a comprehensive, cost-effective, and efficient method for the refined management of urban poverty space and the improvement of built-environment quality.
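The pipeline described above (a multiplicative index per neighborhood, followed by SOM clustering of neighborhoods) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 1-D SOM, the factor vectors, and all parameter values (grid size, learning rate, neighborhood width, epochs) are assumptions made for the example.

```python
import math
import random

def spatial_index(factors):
    """Multiplicative index: product of normalized factor scores in [0, 1].

    `factors` might hold, e.g., normalized accessibility, block vitality,
    per-unit rent, infrastructure, and socio-economic scores (assumed here).
    """
    si = 1.0
    for f in factors:
        si *= f
    return si

def train_som(samples, grid_size=3, epochs=50, lr0=0.5, sigma0=1.0, seed=0):
    """Train a tiny 1-D self-organizing map and return its unit weights.

    Each sample is an equal-length feature vector for one neighborhood.
    Learning rate and neighborhood width decay linearly over the epochs.
    """
    rng = random.Random(seed)
    dim = len(samples[0])
    # Initialize unit weight vectors randomly in [0, 1]
    weights = [[rng.random() for _ in range(dim)] for _ in range(grid_size)]
    for epoch in range(epochs):
        frac = 1.0 - epoch / epochs
        lr = lr0 * frac
        sigma = max(sigma0 * frac, 1e-3)
        for x in samples:
            # Best-matching unit: closest weight vector (squared distance)
            bmu = min(range(grid_size),
                      key=lambda u: sum((w - v) ** 2
                                        for w, v in zip(weights[u], x)))
            # Pull the BMU and its grid neighbors toward the sample
            for u in range(grid_size):
                h = math.exp(-((u - bmu) ** 2) / (2.0 * sigma ** 2))
                weights[u] = [w + lr * h * (v - w)
                              for w, v in zip(weights[u], x)]
    return weights

def assign_cluster(x, weights):
    """Return the index of the SOM unit closest to feature vector x."""
    return min(range(len(weights)),
               key=lambda u: sum((w - v) ** 2 for w, v in zip(weights[u], x)))
```

In a usage pass one would compute each neighborhood's factor vector, train the SOM on all vectors, and read off each neighborhood's cluster label; the cluster whose unit weights correspond to uniformly low scores would then be interpreted as the candidate poverty space.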
