Abstract
This paper proposes two novel time-of-flight based fire detection methods, one for indoor and one for outdoor fire detection. The indoor detector is based on the depth and amplitude images of a time-of-flight camera. Using this multi-modal information, flames can be detected very accurately by detecting fast-changing depth and amplitude disorder. In order to detect the fast-changing depth, depth differences between consecutive frames are accumulated over time. Regions containing multiple pixels with a high accumulated depth difference are labeled as candidate flame regions. Simultaneously, the amplitude disorder is investigated. Regions with high accumulated amplitude differences and high values in all detail images of the discrete wavelet transform of the amplitude image are also labeled as candidate flame regions. Finally, if a depth candidate region and an amplitude candidate region overlap, a fire alarm is raised. The outdoor detector differs from the indoor detector in only one of its multi-modal inputs: as depth maps are unreliable in outdoor environments, the outdoor detector uses a visual flame detector instead of the fast-changing depth detection. Experiments show that the proposed detectors have an average flame detection rate of 94% with no false positive detections.
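The fast-changing-depth step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the decay factor, threshold, and minimum region size are assumed parameters, and a simple pixel count stands in for the paper's region labelling.

```python
import numpy as np

def accumulate_depth_differences(frames, decay=0.9):
    """Accumulate per-pixel absolute depth differences between
    consecutive frames. The 'decay' factor (an assumption, not from
    the paper) lets old motion fade so only sustained flicker keeps
    a high score."""
    acc = np.zeros(frames[0].shape, dtype=float)
    for prev, curr in zip(frames, frames[1:]):
        acc = decay * acc + np.abs(curr.astype(float) - prev.astype(float))
    return acc

def candidate_flame_mask(acc, threshold=5.0, min_pixels=4):
    """Mark pixels whose accumulated depth difference exceeds a
    threshold as flame candidates. Requiring at least 'min_pixels'
    qualifying pixels is a crude stand-in for the paper's
    multi-pixel region labelling."""
    mask = acc > threshold
    if mask.sum() < min_pixels:
        return np.zeros_like(mask)
    return mask
```

For example, a small image region whose depth values flicker between frames accumulates a high score and is flagged, while static background pixels stay below the threshold.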