Abstract

Firefighting robots are actively being researched to reduce firefighter injuries and deaths and to increase effectiveness in performing firefighting tasks. Making these robots autonomous has been difficult because the sensors commonly used for autonomous navigation perform poorly in smoke-filled fire environments, where visibility is low and temperatures are high. To overcome these limitations, a multi-spectral vision system was developed that fuses stereo thermal infrared (IR) vision with frequency-modulated continuous-wave (FMCW) radar to locate objects through zero-visibility smoke in real time. In this system, the stereo IR vision was used to obtain 3-D information about the scene, while the radar provided more accurate distances to objects in the field of view. By globally matching radar-detected objects with those in the 3-D image, the stereo IR vision map was corrected, removing both the far-field inaccuracy of the stereo IR and the ghost objects created by stereo mismatch. The system was fast enough to match objects in the scene in real time, enabling dynamic object tracking and localization. In large-scale fire experiments with and without smoke in the field of view, sensor fusion of the stereo IR with the FMCW radar reduced the distance error of the stereo IR vision from 1%–19.0% to 1%–2%.
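
The abstract does not include an implementation, but the core fusion step it describes (keeping object bearings from the stereo IR point cloud while replacing the less reliable stereo ranges with radar ranges) can be sketched in a few lines. The following Python sketch is purely illustrative: the function name, the data layouts, and the greedy nearest-azimuth matching are assumptions, not the paper's method, which uses a global matching of radar objects to the 3-D image.

```python
import numpy as np

def fuse_radar_with_stereo_ir(stereo_objects, radar_targets,
                              max_azimuth_diff_deg=3.0):
    """Correct stereo-IR object distances with FMCW radar ranges.

    stereo_objects: list of (x, y, z) object centroids from the stereo
        IR disparity map, in the camera frame (meters, z along the
        optical axis).
    radar_targets: list of (azimuth_deg, range_m) radar detections,
        assumed already co-registered with the camera frame.

    Both layouts are hypothetical; they stand in for whatever object
    lists the real pipeline produces.
    """
    fused = []
    for x, y, z in stereo_objects:
        az_cam = np.degrees(np.arctan2(x, z))  # bearing of the stereo object
        r_cam = np.hypot(x, z)                 # stereo-derived planar range
        if r_cam == 0 or not radar_targets:
            continue
        # Greedy stand-in for the paper's global matching: pick the
        # radar target closest in azimuth to this stereo object.
        best = min(radar_targets, key=lambda t: abs(t[0] - az_cam))
        if abs(best[0] - az_cam) < max_azimuth_diff_deg:
            # Trust the radar range: rescale the stereo 3-D point so its
            # planar range equals the radar range, keeping the bearing.
            scale = best[1] / r_cam
            fused.append((x * scale, y * scale, z * scale))
        # Unmatched stereo objects are dropped as likely ghost objects
        # produced by stereo mismatch, as the abstract describes.
    return fused
```

For crowded scenes, a global assignment (e.g., the Hungarian algorithm over an azimuth-difference cost matrix) would be closer in spirit to the global matching described above than the greedy nearest-neighbor matching used in this sketch.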
