Aiming at the low small-target detection accuracy, high miss rate, and slow detection speed of existing UAV fire detection systems, an improved YOLOv5 UAV flame detection algorithm is proposed. First, anchor box clustering is optimized with the K-means++ algorithm to reduce the classification error rate. Second, the original backbone network is enhanced with the CBAM attention mechanism, which scans the whole feature map and assigns higher weights to the target regions that need to be focused on. Third, the PANet network in the neck is replaced with a BiFPN network, and skip connections are introduced during feature fusion to better retain the semantic information of both high-level and low-level features. Finally, the α-IoU loss function is adopted; by modulating α, different levels of bounding-box regression accuracy can be achieved, which improves detection accuracy on small datasets and robustness to noise. Experimental results on a randomly partitioned dataset show that the modified YOLOv5 algorithm achieves an mAP of 80.2%, 6.7 percentage points higher than the original YOLOv5, while maintaining 64 FPS. The method improves the accuracy of UAV-based fire monitoring and outperforms existing flame detection algorithms, meeting the requirements of practical applications.
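The anchor-fitting step can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names are invented here, and it assumes the common YOLO convention of clustering box (width, height) pairs with a 1 − IoU shape distance, using k-means++ seeding followed by Lloyd iterations.

```python
import numpy as np

def wh_iou(wh, centers):
    """Shape-only IoU between boxes and anchors (as if sharing one corner).
    wh: (n, 2) array of (w, h); centers: (k, 2). Returns (n, k)."""
    inter = np.minimum(wh[:, None, 0], centers[None, :, 0]) * \
            np.minimum(wh[:, None, 1], centers[None, :, 1])
    union = wh[:, 0:1] * wh[:, 1:2] + (centers[:, 0] * centers[:, 1])[None] - inter
    return inter / union

def kmeanspp_anchors(wh, k=9, iters=30, seed=0):
    """Cluster (w, h) pairs into k anchors: k-means++ seeding, then
    Lloyd iterations, both under the 1 - IoU distance."""
    rng = np.random.default_rng(seed)
    centers = wh[rng.integers(len(wh))][None]           # first seed: one random box
    for _ in range(k - 1):                              # k-means++: sample far boxes
        d = (1 - wh_iou(wh, centers)).min(axis=1)       # distance to nearest seed
        centers = np.vstack([centers, wh[rng.choice(len(wh), p=d / d.sum())]])
    for _ in range(iters):                              # standard Lloyd refinement
        assign = (1 - wh_iou(wh, centers)).argmin(axis=1)
        centers = np.array([wh[assign == j].mean(axis=0) if (assign == j).any()
                            else centers[j] for j in range(k)])
    return centers[np.argsort(centers.prod(axis=1))]    # sort anchors by area
```

The k-means++ seeding matters because plain random initialization can place several seeds inside one dense cluster of box sizes, yielding redundant anchors; distance-weighted seeding spreads the initial anchors across the size distribution.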
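The α-IoU term can likewise be illustrated in its simplest form, where the loss is 1 − IoU^α and α = 1 recovers the plain IoU loss. The helper names and the single-pair setup below are illustrative assumptions, not the paper's training code, which would typically fold α-IoU into a CIoU-style variant inside YOLOv5.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def alpha_iou_loss(pred, target, alpha=3.0):
    """Basic alpha-IoU loss: 1 - IoU**alpha. With alpha > 1 the gradient
    concentrates on high-IoU boxes, sharpening regression on predictions
    that are already close; alpha = 1 gives the ordinary IoU loss."""
    return 1.0 - iou(pred, target) ** alpha
```

Tuning α is how the abstract's "different levels of bounding-box regression accuracy" is obtained: larger α rewards tightly fitting boxes more aggressively, which is reported to help on small datasets and under label noise.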