Abstract

Timely and accurate population mapping plays an essential role in a wide range of critical applications. Benefiting from the emergence of multi-source geospatial datasets and advances in spatial statistics and machine learning, multi-scale population mapping with high temporal resolution has become possible. However, overly complex models and the strict data requirements resulting from the constant quest for increased accuracy pose challenges to the repeatability of many population spatialization frameworks. Therefore, in this study, using limited publicly available datasets and an automatic ensemble learning model (AutoGluon), we present an efficient framework that simplifies model training and prediction. The proposed framework was applied to estimate county-level population density in China and achieved strong results, with an R² of 0.974 and an RMSD of 427.61, outperforming current mainstream population mapping frameworks in estimation accuracy. Furthermore, the derived monthly population maps and the revealed spatial pattern of population dynamics in China are consistent with earlier studies, suggesting the robustness of the proposed framework in cross-time mapping. To the best of our knowledge, this study is the first to apply AutoGluon to population mapping, and the framework's efficient, automated modeling capabilities will contribute to larger-scale and finer spatio-temporal population spatialization studies.
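The two accuracy metrics reported above can be computed from paired predicted and reference values in the usual way. The sketch below is illustrative only, assuming R² denotes the coefficient of determination and RMSD the root-mean-square deviation; the county densities shown are toy numbers, not the study's data:

```python
import numpy as np

def r2_score(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmsd(y_true, y_pred):
    # Root-mean-square deviation between estimated and reference densities.
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# Toy county-level population densities (persons per km^2); hypothetical values.
y_true = np.array([210.0, 890.0, 55.0, 1150.0, 560.0, 130.0])
y_pred = np.array([230.0, 850.0, 70.0, 1100.0, 590.0, 120.0])

print(r2_score(y_true, y_pred), rmsd(y_true, y_pred))
```

A higher R² (closer to 1) and a lower RMSD both indicate closer agreement between the mapped densities and the census reference, which is how the framework's accuracy is benchmarked against existing mapping products.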
