Forest fires are known for their high level of randomness and unpredictability, which often lead to significant ecological damage and loss of human life. Existing forest fire detection technologies cannot detect small-scale flames or smoke in real time, and therefore fail to meet the demands of real-time forest fire detection with Unmanned Aerial Vehicles (UAVs). To overcome these limitations, we propose an efficient and lightweight forest fire detection method that uses synthetic images and UAVs to achieve real-time, high-precision detection of forest fires against complex backgrounds. First, we propose the Dilation Repconv Cross Stage Partial Network (DRCSPNet), which enhances the detection of multiscale flames and smoke through multi-branch parallel dilated convolution and batch normalization, while effectively extracting features from different stages of forest fires. Second, to mitigate the challenges posed by extreme lighting in forest scenes and large contrast variation in fire images, we propose a Global Mixed-Attention (GMA) module across feature pyramids that compensates for information lost in high-dimensional feature maps and increases the robustness of the model through a multiscale fusion strategy. Finally, we present the Lite-Path Aggregation Network (Lite-PAN) with varying scales to improve effective feature flow for multilevel forest fires, addressing challenges that arise under diverse climatic conditions. Furthermore, we employ Unreal Engine 5 to generate forest fire datasets covering four scenarios, addressing the relative scarcity of aerial forest fire datasets. Experimental results show that our proposed method achieves 58.39% mAP (mean Average Precision) with 5.703 GFLOPs (Giga Floating-Point Operations) while running at 33.5 Frames Per Second (FPS) on an NVIDIA Jetson NX. Extensive experiments demonstrate that, compared with state-of-the-art techniques, our method offers real-time performance, high accuracy, and ease of deployment.
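The abstract describes DRCSPNet as combining multi-branch parallel dilated convolution with batch normalization to capture multiscale flame and smoke features. The following is a minimal, hypothetical sketch of such a block; the class name, channel sizes, dilation rates, and activation are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a multi-branch parallel dilated-convolution block.
# Names, channel sizes, and dilation rates are illustrative assumptions;
# the paper's DRCSPNet details are not given in the abstract.
import torch
import torch.nn as nn

class DilatedBranchBlock(nn.Module):
    """Parallel 3x3 convolutions with different dilation rates, each followed
    by batch normalization, fused by summation so receptive fields of
    different sizes contribute to one multiscale feature map."""
    def __init__(self, in_ch: int, out_ch: int, dilations=(1, 2, 3)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(out_ch),
            )
            for d in dilations
        ])
        self.act = nn.SiLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sum the parallel branches, then apply a shared activation.
        out = sum(branch(x) for branch in self.branches)
        return self.act(out)

if __name__ == "__main__":
    block = DilatedBranchBlock(in_ch=64, out_ch=64)
    feat = torch.randn(1, 64, 80, 80)   # e.g. a feature map from a UAV image
    print(block(feat).shape)            # torch.Size([1, 64, 80, 80])
```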