Abstract

Urban search and rescue missions require rapid intervention to locate victims and survivors in affected environments. To facilitate this activity, Unmanned Aerial Vehicles (UAVs) have recently been used to explore the environment and locate possible victims. In this paper, a UAV equipped with multiple complementary sensors is used to detect the presence of a human in an unknown environment. A novel human localization approach for unknown environments is proposed that merges information gathered from deep-learning-based human detection, wireless signal mapping, and thermal signature mapping to build an accurate global human-location map. A next-best-view (NBV) approach with a proposed multi-objective utility function iteratively evaluates the map to rapidly locate humans. Results demonstrate that the proposed strategy outperforms other methods in several performance measures, such as the number of iterations, entropy reduction, and traveled distance.
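
To make the map-merging step concrete, the minimal sketch below fuses three per-sensor victim-probability grids using a standard log-odds Bayesian combination. The 2D-grid representation, the sensor-independence assumption, and all function names are illustrative assumptions; the paper's exact fusion rule is given in its map-merging section.

    import numpy as np

    def to_log_odds(p):
        """Convert probabilities to log-odds, clipping to avoid infinities."""
        p = np.clip(p, 1e-6, 1.0 - 1e-6)
        return np.log(p / (1.0 - p))

    def fuse_maps(vision_map, thermal_map, wireless_map, prior=0.5):
        """Fuse per-sensor victim-probability grids into one merged map.

        Each input is a 2D array of P(victim present) per cell. Fusion sums
        log-odds relative to the prior, a standard Bayesian combination for
        (assumed) independent sensors; illustrative only.
        """
        prior_lo = to_log_odds(prior)
        fused_lo = prior_lo + sum(to_log_odds(m) - prior_lo
                                  for m in (vision_map, thermal_map, wireless_map))
        return 1.0 / (1.0 + np.exp(-fused_lo))

    # Example: three 4x4 maps, each mildly confident about cell (1, 2)
    v = np.full((4, 4), 0.5); v[1, 2] = 0.8
    t = np.full((4, 4), 0.5); t[1, 2] = 0.7
    w = np.full((4, 4), 0.5); w[1, 2] = 0.6
    print(fuse_maps(v, t, w)[1, 2])  # fused belief exceeds any single sensor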

Highlights

  • The current advanced development of aerial robotics research has enabled the deployment of UAVs in a variety of tasks such as urban search and rescue (USAR) [1,2,3,4], infrastructure inspection [5], 2D/3D reconstruction of buildings using point clouds generated from Unmanned Aerial Vehicle (UAV) images [6], 3D reconstruction of historical cities using laser scanning and digital photogrammetry [7], and mining [8]

  • An adaptive grid sampling algorithm (AGSA) to resolve the local-minimum issue that occurs in the Regular Grid Sampling Approach (RGSA); a sampling sketch follows this list

  • The procedure for thermal-based victim localization, summarized in Algorithm 2, involves two stages: victim detection, where the victim is found in the thermal image based on its heat signature, and thermal occupancy mapping, where the detection confidences obtained from the thermal detector are stored (see the mapping sketch after this list)
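
One plausible reading of the AGSA highlight is sketched below: candidate viewpoints are first drawn on a regular grid around the current position, and when none of them improves on the current viewpoint (the local minimum of the regular grid), the sampling radius and resolution are adapted and the grid is resampled. The adaptation rule, parameters, and function names are assumptions, not the paper's algorithm.

    import itertools
    import numpy as np

    def regular_grid(center, radius, step):
        """Candidate viewpoints on a regular grid around the current position."""
        offsets = np.arange(-radius, radius + 1e-9, step)
        return [center + np.array([dx, dy])
                for dx, dy in itertools.product(offsets, offsets)]

    def adaptive_grid_sampling(center, utility, radius=2.0, step=1.0, max_levels=4):
        """Illustrative AGSA: if no sampled viewpoint beats the current one
        (a local minimum of the regular grid), widen the radius, refine the
        step, and resample. Parameters and update rule are hypothetical."""
        current_u = utility(center)
        for _ in range(max_levels):
            candidates = regular_grid(center, radius, step)
            best = max(candidates, key=utility)
            if utility(best) > current_u:
                return best          # escaped the local minimum
            radius *= 2.0            # widen the search region
            step /= 2.0              # and sample more densely
        return center                # keep the current viewpoint as a fallback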

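The two stages of the thermal highlight (victim detection followed by thermal occupancy mapping) could look roughly like the sketch below. The temperature band, the confidence model, and the log-odds map update are assumptions for illustration, not the paper's Algorithm 2.

    import numpy as np

    def detect_victims_thermal(thermal_image, body_temp_range=(30.0, 40.0)):
        """Stage 1 (illustrative): flag pixels whose temperature lies inside a
        human heat-signature band and assign a crude per-pixel confidence."""
        lo, hi = body_temp_range
        mask = (thermal_image >= lo) & (thermal_image <= hi)
        center = (lo + hi) / 2.0
        # Confidence grows as a pixel's temperature approaches the band center.
        confidence = np.where(mask,
                              1.0 - np.abs(thermal_image - center) / (hi - lo),
                              0.0)
        return mask, confidence

    def update_thermal_map(log_odds_map, cells, confidences):
        """Stage 2 (illustrative): fold detection confidences into a log-odds
        thermal occupancy map, one projected grid cell per detection."""
        for cell, c in zip(cells, confidences):
            c = float(np.clip(c, 1e-6, 1.0 - 1e-6))
            log_odds_map[cell] += np.log(c / (1.0 - c))
        return log_odds_map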

Summary

Introduction

The current advanced development of aerial robotics research has enabled the deployment of UAVs in a variety of tasks such as urban search and rescue (USAR) [1,2,3,4], infrastructure inspection [5], 2D/3D reconstruction of buildings using point clouds generated from UAV images [6], 3D reconstruction of historical cities using laser scanning and digital photogrammetry [7], and mining [8]. In this work, a probabilistic sensor fusion technique is applied to the generated maps to create a merged map, which the robot uses to explore the environment and localize the presence of victims. We present a multi-sensor-based NBV system to locate victims in a simulated urban search and rescue environment. The proposed multi-sensor system uses different sensor sources, namely vision, thermal, and wireless, to generate victim-location maps, which are fused into a merged map using a probabilistic Bayesian framework. The proposed multi-objective utility function evaluates viewpoints based on three desired objectives, namely exploration, victim detection, and traveled distance. The core contribution is a multi-sensor fusion approach that uses vision, thermal, and wireless sensors to generate probabilistic 2D victim-location maps.
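
A minimal sketch of how the three objectives named above (exploration, victim detection, and traveled distance) could be combined into a single viewpoint score is shown below. The weighted-sum form, the weights, and the function names are assumptions; the paper defines its own multi-objective utility function.

    def viewpoint_utility(info_gain, victim_prob, travel_dist,
                          w_explore=1.0, w_victim=1.0, w_dist=0.5):
        """Illustrative multi-objective utility: reward expected information
        gain (exploration) and the victim belief visible from the viewpoint
        (victim detection), and penalize the distance to reach it."""
        return w_explore * info_gain + w_victim * victim_prob - w_dist * travel_dist

    def next_best_view(candidates):
        """Pick the highest-utility viewpoint from a list of
        (info_gain, victim_prob, travel_dist, pose) tuples."""
        return max(candidates,
                   key=lambda c: viewpoint_utility(c[0], c[1], c[2]))[3]

An exponential distance discount, common in NBV planners, would be an equally plausible form for trading off gain against travel cost.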

Related Work
Victim Detection
Mapping
Exploration
Viewpoint Sampling
Viewpoint Evaluation
Termination Conditions
Proposed Approach
Vision-Based Victim Localization
Thermal-Based Victim Localization
Wireless-Based Victim Localization
Multi-Sensor Occupancy Grid Map Merging
Experimental Results
Vehicle Model and Environment
Test Scenarios and Parameters
Discussion of Results
Conclusions