Abstract

This paper presents a novel system for automatic forest-fire measurement using cameras distributed at ground stations and mounted on Unmanned Aerial Systems (UAS). It obtains geometrical measurements of forest fires in real time, such as the location and shape of the fire front, flame height and rate of spread. Measurement of forest fires is a challenging problem that is affected by numerous potential sources of error. The proposed system addresses them by exploiting the complementarities between infrared and visual cameras located at different ground locations together with others onboard UAS. The system applies image processing and geo-location techniques to obtain forest-fire measurements individually from each camera and then integrates the results from all the cameras using statistical data fusion techniques. The proposed system has been extensively tested and validated in close-to-operational conditions in field fire experiments with controlled safety conditions carried out in Portugal and Spain from 2001 to 2006.
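The statistical data fusion mentioned above can be illustrated with a minimal sketch. Assuming each camera yields an independent estimate of a scalar fire measurement (e.g. flame height) with a known error variance, a standard inverse-variance weighted combination fuses them; the function name and the numeric values below are illustrative, not taken from the paper:

```python
# Minimal sketch: inverse-variance weighted fusion of independent
# per-camera estimates of a scalar fire measurement (e.g. flame height, m).

def fuse(estimates):
    """estimates: list of (value, variance) pairs, one per camera."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)  # fused variance is smaller than any input
    return fused, fused_var

# Illustrative inputs: an infrared ground camera, a visual ground camera
# and a UAS-mounted camera, with increasing measurement variance.
cams = [(4.2, 0.25), (3.8, 0.50), (4.0, 1.00)]
value, var = fuse(cams)
```

The fused estimate leans toward the lower-variance cameras, and its variance is lower than that of any single camera, which is the rationale for combining heterogeneous sensors.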

Highlights

  • Wildfires destroy thousands of hectares each year and incur very high social, environmental and economic costs

  • In the experiments carried out, camera positions and orientations were measured with Differential GPS and Inertial Measurement Units, both for cameras at ground stations and onboard Unmanned Aerial Systems (UAS)

  • Terrain projection is sensitive to errors in the location of the camera, see Figure 11. To prevent these errors the location of the UAS is measured with Differential GPS (DGPS)
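The terrain projection referred to above can be sketched as follows, under a deliberately simplified flat-terrain pinhole-camera model (the paper's actual geo-referencing accounts for terrain elevation; all names and parameters here are illustrative). A pixel defines a ray from the camera, which is rotated into the world frame using the camera orientation (from an IMU) and intersected with the terrain plane; errors in the camera position (hence the use of DGPS) shift the intersection point directly:

```python
import math

def geolocate_pixel(cam_pos, yaw, pitch, u, v, f, terrain_z=0.0):
    """Project image pixel (u, v) onto a flat terrain plane z = terrain_z.

    cam_pos: (x, y, z) camera position, e.g. from DGPS.
    yaw, pitch: camera orientation in radians, e.g. from an IMU.
    f: focal length in pixel units.
    Returns the (x, y) ground point, or None if the ray misses the terrain.
    """
    # Ray in the camera frame: optical axis along +x, u to the right (+y),
    # v downward in the image (-z in the world-aligned camera frame).
    dx, dy, dz = f, u, -v
    # Rotate by pitch (about the y axis), then by yaw (about the z axis).
    cx = dx * math.cos(pitch) + dz * math.sin(pitch)
    cz = -dx * math.sin(pitch) + dz * math.cos(pitch)
    wx = cx * math.cos(yaw) - dy * math.sin(yaw)
    wy = cx * math.sin(yaw) + dy * math.cos(yaw)
    wz = cz
    if wz >= 0:
        return None  # ray points at or above the horizon
    t = (terrain_z - cam_pos[2]) / wz
    return cam_pos[0] + t * wx, cam_pos[1] + t * wy

# Illustrative use: a camera 100 m above flat ground, pitched 45 degrees
# down, looking at the image centre, geolocates a point 100 m ahead.
pt = geolocate_pixel((0.0, 0.0, 100.0), 0.0, math.pi / 4, 0.0, 0.0, 1000.0)
```

Note that an error in `cam_pos` translates one-for-one into an error in the projected ground point, which is why accurate DGPS positioning of the UAS matters for this step.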


Summary

Introduction

Wildfires destroy thousands of hectares each year and incur very high social, environmental and economic costs. Forest-fire fighting is traditionally based on estimations made by fire-fighting experts from visual observations directly on the terrain or by analyzing data provided by sensors. To the best of our knowledge, the system presented in this paper is one of the first forest-fire measurement systems that integrates results from visual and infrared cameras at both fixed ground locations and onboard UAS. It has been extensively validated in forest-fire field experiments in close-to-operational conditions in which plots of land of up to 2.5 hectares were burned.

Related Work and Motivation
General Description
Sensors
System Deployment
Forest-Fire Measurement Processing
Image Pre-Processing
Fire Feature Extraction
Image Calibration and Geo-Referencing
Multi-Camera Forest-Fire Estimation
Sources of Error
Experimental Results
Conclusions
