Capturing the dynamics of urban fire situations is a fundamental yet challenging task that plays an indispensable role in urban security and fire emergency decision making. Traditional methods approach urban fire prediction through stochastic processes grounded in physics or statistics, which are interpretable but often impractical in real applications. Recently, data-driven models such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Graph Convolutional Networks (GCNs) have proven fruitful at capturing spatio-temporal dynamics from massive high-dimensional data. In this paper, we process a regional urban fire dataset covering the most recent six years into fire situation awareness images (FSAIs) and extract pixel-level latent representations with CNNs, while GCNs process auxiliary spatial graph-structure information to obtain graph-level latent representations. We then formulate the Urban Fire Situation Prediction Neural Network (UFSP-Net), a novel urban fire prediction model that integrates these two kinds of spatial latent representations within an RNN structure. Compared with baseline algorithms such as Conv-RNN, UFSP-Net demonstrates superior prediction performance for multiple types of urban fire on the spatio-temporal scale.
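The fusion pattern described above (pixel-level CNN embeddings and graph-level GCN embeddings combined through a recurrent cell) can be sketched as follows. This is an illustrative sketch only: the abstract does not specify UFSP-Net's actual layers, dimensions, pooling, or fusion scheme, so every name and shape here is an assumption, and the CNN output is replaced by a random stand-in vector.

```python
# Minimal sketch, assuming: concatenation fusion, mean pooling over graph
# nodes, and a plain tanh RNN cell. None of these choices are confirmed by
# the abstract; hypothetical sizes throughout.
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A, X, W):
    """One graph-convolution step: D^{-1/2} (A + I) D^{-1/2} X W."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.tanh(A_norm @ X @ W)

def rnn_step(h, z, W_h, W_z):
    """Plain tanh RNN cell over the fused spatial embedding z."""
    return np.tanh(h @ W_h + z @ W_z)

# Hypothetical sizes: 5 graph nodes (city regions), 4 node features,
# 8-dim CNN embedding of the FSAI, 16-dim hidden state, 6 time steps.
n_nodes, f_node, d_cnn, d_hid, T = 5, 4, 8, 16, 6
W_g = rng.normal(size=(f_node, d_cnn))           # GCN weight
W_h = rng.normal(size=(d_hid, d_hid)) * 0.1      # recurrent weight
W_z = rng.normal(size=(2 * d_cnn, d_hid)) * 0.1  # input weight
A = (rng.random((n_nodes, n_nodes)) < 0.4).astype(float)
A = np.maximum(A, A.T)                           # symmetric adjacency

h = np.zeros(d_hid)
for t in range(T):
    cnn_emb = rng.normal(size=d_cnn)             # stand-in for the CNN output
    X = rng.normal(size=(n_nodes, f_node))       # node features at time t
    gcn_emb = gcn_layer(A, X, W_g).mean(axis=0)  # pool nodes to graph level
    z = np.concatenate([cnn_emb, gcn_emb])       # fuse the two embeddings
    h = rnn_step(h, z, W_h, W_z)                 # recurrent update

print(h.shape)  # final hidden state, (16,)
```

The hidden state `h` would then feed a prediction head (not shown); the tanh cell keeps every coordinate of `h` in [-1, 1].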