Abstract

Forest fires are a natural hazard that occurs frequently in southern European countries. To avoid major damage and to improve forest fire management, forest fire spread simulators can be used to predict fire behavior. When providing forest fire predictions, there are two main considerations: accuracy and computation time. In the context of natural-hazard simulation, it is well known that part of the final forecast error comes from uncertainty in the input data. These data typically consist of a set of GIS files, which should be appropriately conflated. For this reason, several input-data calibration methods have been developed by the scientific community. In this work, the Two-Stage calibration methodology, which has been shown to provide good results, is used. This calibration strategy is computationally intensive and time-consuming because it relies on a Genetic Algorithm. Given the urgency inherent in forest fire spread prediction, a balance must be maintained between accuracy and the time needed to calibrate the input parameters. To take advantage of this technique, one must deal with the fact that some of the obtained solutions are impractical: their simulation times are too long, preventing the prediction system from being deployed at an operational level. A new method is proposed that finds the minimum resolution reduction for such long simulations while keeping the accuracy loss within a known interval. The proposed improvement is based on a time-aware core allocation policy that enables real-time forest fire spread forecasting. The resulting prediction system is a cyberinfrastructure that delivers forest fire spread predictions in real time.
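As a rough illustration of the resolution-reduction idea described above, the following Python sketch selects the smallest coarsening factor whose estimated runtime fits a given deadline. The function name, the candidate factors, and the runtime model (simulation time scaling inversely with the square of the coarsening factor, for a 2D cell grid) are illustrative assumptions, not the paper's actual implementation.

    def min_resolution_reduction(base_time_s, deadline_s, factors=(1, 2, 4, 8)):
        """Return the smallest coarsening factor f (each grid axis reduced
        by f) whose estimated runtime base_time_s / f**2 meets deadline_s,
        or None if no candidate factor is fast enough."""
        for f in sorted(factors):
            # Assumed cost model: runtime shrinks with the number of cells,
            # i.e. quadratically in the per-axis coarsening factor.
            if base_time_s / (f * f) <= deadline_s:
                return f
        return None

    if __name__ == "__main__":
        # A simulation estimated at 1 hour at full resolution,
        # with a 10-minute operational budget:
        print(min_resolution_reduction(3600.0, 600.0))  # -> 4 (3600/16 = 225 s)

In the setting the abstract describes, a factor chosen this way would feed the time-aware core allocation policy, trading a bounded accuracy loss for a runtime that meets the operational deadline.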
