Abstract
Real-time monitoring and analysis systems based on video images have been deployed to detect fire accidents on site. Although most segmentation methods can accurately segment smoke regions in bright, clear images, achieving high performance on low-light smoke images is difficult because of their low brightness and contrast. To improve the segmentation of low-light smoke images, an image enhancement model cascaded with a semantic segmentation model was proposed. A modified Cycle-Consistent Generative Adversarial Network (CycleGAN) enhances the low-light images, making smoke features more apparent and improving the detection ability of the subsequent segmentation model. The smoke segmentation model, based on Transformers and HRNet, densely fuses semantic features at different scales. Attention modules along the spatial and channel dimensions were added to the feature extraction units to establish mappings between pixels and features in the two spatial directions, improving segmentation performance. A Foreground Feature Localization Module (FFLM) increases the discrimination between foreground and background features and improves the model's ability to distinguish thin regions at smoke edges. The proposed method achieved a segmentation accuracy of 91.68% on a self-built dataset of synthetic low-light images, with an overall detection time of 120.1 ms. It can meet fire detection demands in low-light nighttime environments and lays a foundation for extending image-based early fire detection to all-weather operation.
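
The abstract describes a cascaded pipeline: a low-light enhancement generator feeding a segmentation network that uses spatial/channel attention. The PyTorch sketch below illustrates that cascade only in outline; the module names, layer sizes, CBAM-style attention block, and the plug-in enhancer/segmenter are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch of the cascaded design described in the abstract.
# All class names, layer choices, and hyperparameters here are assumptions.
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    """Assumed channel + spatial attention block (CBAM-style) for feature units."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Channel attention: squeeze spatial dims, re-weight channels.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: map pooled statistics to a per-pixel weight.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)  # channel-dimension attention
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1
        )
        return x * self.spatial_gate(pooled)  # spatial-dimension attention


class CascadedSmokeSegmenter(nn.Module):
    """Enhancement model cascaded with a segmentation model, as in the abstract."""

    def __init__(self, enhancer: nn.Module, segmenter: nn.Module):
        super().__init__()
        self.enhancer = enhancer    # e.g. a trained CycleGAN-style generator
        self.segmenter = segmenter  # e.g. a Transformer/HRNet-style network

    def forward(self, low_light_image: torch.Tensor) -> torch.Tensor:
        enhanced = self.enhancer(low_light_image)  # brighten, restore contrast
        return self.segmenter(enhanced)            # per-pixel smoke mask logits
```

At inference the two stages run back to back on each video frame, which is consistent with the single end-to-end detection time (120.1 ms per image) reported above.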