Abstract

Intelligent detection and processing capabilities can be instrumental in improving the safety, efficiency, and success of rescue missions conducted by firefighters in emergency first-response settings. The objective of this research is to create an automated system capable of real-time, intelligent object detection and recognition that improves the situational awareness of firefighters during an emergency response. To accomplish this, we explore state-of-the-art machine- and deep-learning techniques and exploit the infrared video actively recorded by firefighters on the scene: a trained deep Convolutional Neural Network (CNN) classifies and identifies objects of interest from thermal imagery in real time. Amid the critical circumstances created by a structure fire, the system extracts, processes, and analyzes crucial scene information and thereby informs the decision-making process of firefighters with up-to-date knowledge of their surroundings. Using this information, firefighters can make more informed inferences about their circumstances and navigate safely through such hazardous and potentially catastrophic environments.

Highlights

  • The application of Convolutional Neural Network (CNN) technology abounds in the surveillance and defense fields [4, 8, 16, 26], but very little documented research applies these principles to overcoming the navigational challenges faced by firefighters in live fire events

  • The results present visualizations of the features extracted by the convolutional layers of the network, its F1 scores and achieved accuracy, and confusion matrices and ROC curves that estimate the false-alarm rate versus the detection probability of the network (see the metric sketch after this list)

  • Our research provides a mechanism that supplies firefighters with real-time information and guidance by automatically interpreting the fireground from the hand-held thermal cameras already in use
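
The sketch below illustrates how the evaluation metrics named in the highlights (F1 score, confusion matrices, and per-class ROC curves) can be computed with scikit-learn. It is not the authors' code; the class names, integer label encoding, and score array shapes are illustrative assumptions.

```python
# Minimal metric sketch (not the authors' implementation).
import numpy as np
from sklearn.metrics import f1_score, confusion_matrix, roc_curve, auc
from sklearn.preprocessing import label_binarize

CLASSES = ["person", "door", "window", "fire"]   # hypothetical label set

def evaluate(y_true: np.ndarray, y_pred: np.ndarray, y_scores: np.ndarray) -> None:
    """y_true, y_pred: integer class ids; y_scores: (N, n_classes) softmax outputs."""
    print("macro F1:", f1_score(y_true, y_pred, average="macro"))
    print("confusion matrix:\n", confusion_matrix(y_true, y_pred))

    # One-vs-rest ROC per class: detection probability (TPR) vs. false-alarm rate (FPR).
    y_bin = label_binarize(y_true, classes=list(range(len(CLASSES))))
    for i, name in enumerate(CLASSES):
        fpr, tpr, _ = roc_curve(y_bin[:, i], y_scores[:, i])
        print(f"AUC[{name}] = {auc(fpr, tpr):.3f}")
```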


Introduction

The application of CNN technology abounds in the surveillance and defense fields [4, 8, 16, 26], but very little documented research applies these principles to overcoming the navigational challenges faced by firefighters in live fire events. Detection processes can be adversely affected by environmental factors inherent in active fire scenes: near-zero visibility caused by debris, smoke, and a lack of lighting, combined with a continuously changing environment, can disorient even experienced firefighters and further inhibit their decision making. Under such hazardous conditions, lives can be lost when rescue operation decisions are based on an incomplete or inaccurate understanding of the most current conditions within the structure. We propose an Artificial Neural Network-based system capable of autonomously identifying objects and humans in the scene of the event in real time, improving the on-the-ground knowledge that dictates decision-making protocols.
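
As a rough illustration of the kind of pipeline such a system implies, the following is a minimal sketch of real-time CNN inference over a thermal video feed, assuming a PyTorch classifier. The network architecture, class list, input resolution, and camera interface are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, not the authors' implementation: run a trained CNN classifier
# over frames from a hand-held thermal camera feed.
import cv2
import torch
import torch.nn.functional as F
from torchvision import transforms

CLASSES = ["person", "door", "window", "ladder", "fire"]   # hypothetical label set

preprocess = transforms.Compose([
    transforms.ToTensor(),            # HxW uint8 thermal frame -> 1xHxW float tensor in [0, 1]
    transforms.Resize((224, 224)),    # match the network's expected input resolution (assumed)
])

@torch.no_grad()
def classify_stream(model: torch.nn.Module, video_source=0, device: str = "cpu") -> None:
    """Classify each frame of a live thermal feed and report the top prediction."""
    model.eval().to(device)
    cap = cv2.VideoCapture(video_source)                      # thermal camera or recorded video
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)         # thermal imagery is single-channel
        x = preprocess(gray).unsqueeze(0).to(device)           # add batch dimension
        probs = F.softmax(model(x), dim=1).squeeze(0)
        label = CLASSES[int(probs.argmax())]
        print(f"{label}: {probs.max():.2f}")                   # a real system would overlay this on the feed
    cap.release()
```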

