This paper addresses the inherent limitations of traditional color-modeling techniques for measuring the flame equivalence ratio (Φ), focusing in particular on the subjectivity of threshold selection and the challenges posed by uneven two-dimensional color distributions. To overcome these issues, this study introduces an attention-based convolutional neural network (ACN) model, a novel approach that moves beyond the conventional reliance on B/G color features (Tf). The ACN model uses adaptive feature extraction, augmented by a spatial attention mechanism, to analyze flame images more effectively. By amplifying key features, autonomously suppressing background noise, and standardizing variations in color distribution, the ACN model achieved a prediction accuracy of 99% in this experiment, with a 76% reduction in error relative to the original model, significantly improving the accuracy and objectivity of flame Φ measurement. This approach represents a substantial advance in the precision and reliability of flame analysis.
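To make the spatial-attention idea concrete, the following is a minimal sketch of how a CBAM-style spatial attention block can be inserted into a small CNN regressor that maps an RGB flame image to a scalar Φ estimate. The layer sizes, the 7×7 attention kernel, and all class and variable names are illustrative assumptions for exposition only; they are not the architecture or hyperparameters reported in the paper.

```python
# Illustrative sketch only: a CBAM-style spatial attention block in a small
# CNN regressor for Phi. All sizes and names are assumptions, not the paper's ACN.
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    """Weights each pixel by pooled channel statistics, suppressing background."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_pool = x.mean(dim=1, keepdim=True)   # (N, 1, H, W) channel-average map
        max_pool = x.amax(dim=1, keepdim=True)   # (N, 1, H, W) channel-max map
        attn = torch.sigmoid(self.conv(torch.cat([avg_pool, max_pool], dim=1)))
        return x * attn                          # emphasize flame pixels, damp background

class FlamePhiRegressor(nn.Module):
    """Toy CNN that predicts the equivalence ratio Phi from an RGB flame image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            SpatialAttention(),                  # re-weight feature maps spatially
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)             # scalar Phi estimate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# Example: one 128x128 RGB flame image -> predicted Phi (untrained, random weights)
phi_hat = FlamePhiRegressor()(torch.rand(1, 3, 128, 128))
```

In this sketch the attention map is learned from pooled channel statistics, so the network can down-weight non-flame pixels without any hand-set color threshold, which is the behavior the abstract attributes to the spatial attention mechanism.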