Abstract
Smoke detection is a key component of disaster and accident detection. Although a wide variety of smoke detection methods and sensors have been proposed, none has maintained a high frame rate while also improving detection performance. In this paper, a smoke detection method for surveillance cameras is presented that relies on the shape features of smoke regions as well as color information. The method exploits the stationary camera by applying background subtraction to detect changes in the scene. The color of smoke is used to estimate the probability that pixels in the scene belong to a smoke region. Because of the variable density of smoke, not all pixels of the actual smoke area appear in the foreground mask; these scattered pixels are merged by morphological operations and connected-component labeling. The existence of a smoke region is confirmed by analyzing the roughness of its boundary. The final step of the algorithm checks the density of edge pixels within a region. Objects in the current and previous frames are compared to distinguish fluid smoke regions from rigid moving objects. Some stages of the algorithm were accelerated by parallel processing on a compute unified device architecture (CUDA) graphics processing unit, enabling fast processing of both low-resolution and high-definition videos. The algorithm was tested on multiple video sequences and showed acceptable processing times over a realistic range of frame sizes.
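For illustration, the following is a minimal sketch of such a pipeline in Python using OpenCV and NumPy, which are assumptions of this sketch rather than the paper's implementation. The grayish-color rule, the structuring element, and the roughness and edge-density thresholds are illustrative placeholders, and the inter-frame comparison and CUDA acceleration described above are omitted.

# A minimal sketch of the pipeline outlined in the abstract, assuming an
# OpenCV/NumPy implementation. Color rule, kernel, and thresholds are
# illustrative placeholders, not the parameters of the paper; the inter-frame
# comparison and CUDA acceleration are omitted here.
import cv2
import numpy as np

bg_subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

def smoke_color_mask(frame_bgr):
    # Flag grayish pixels: small spread between channels, mid-range intensity.
    b, g, r = (frame_bgr[..., i].astype(np.int16) for i in range(3))
    spread = np.maximum.reduce([np.abs(r - g), np.abs(g - b), np.abs(b - r)])
    intensity = frame_bgr.mean(axis=2)
    return ((spread < 20) & (intensity > 80) & (intensity < 220)).astype(np.uint8) * 255

def detect_smoke_regions(frame_bgr, min_area=400):
    # 1. Background subtraction restricted to smoke-colored pixels.
    fg = bg_subtractor.apply(frame_bgr)
    fg = cv2.bitwise_and(fg, smoke_color_mask(frame_bgr))

    # 2. Morphological closing unites scattered foreground pixels.
    fg = cv2.morphologyEx(fg, cv2.MORPH_CLOSE, kernel, iterations=2)

    # 3. Connected-component labeling groups pixels into candidate regions.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(fg)
    edges = cv2.Canny(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY), 100, 200)

    regions = []
    for i in range(1, n):                        # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            continue
        region = (labels == i).astype(np.uint8) * 255

        # 4. Boundary roughness: contour perimeter vs. convex-hull perimeter.
        contours, _ = cv2.findContours(region, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        contour = max(contours, key=cv2.contourArea)
        hull = cv2.convexHull(contour)
        roughness = cv2.arcLength(contour, True) / max(cv2.arcLength(hull, True), 1e-6)

        # 5. Edge-pixel density inside the candidate region (smoke blurs edges).
        density = cv2.countNonZero(cv2.bitwise_and(edges, region)) / stats[i, cv2.CC_STAT_AREA]

        if roughness > 1.3 and density < 0.15:   # illustrative thresholds
            regions.append(tuple(stats[i, :4]))  # (x, y, width, height)
    return regions

In a full implementation along the lines of the abstract, detect_smoke_regions would be called once per frame, candidate regions would additionally be matched against those of the previous frame to reject rigid moving objects, and the per-pixel stages would be offloaded to a CUDA GPU.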