Abstract

Researchers have proposed the use of unmanned aerial vehicles (UAVs) in humanitarian relief to search for victims in disaster-affected areas. Since UAVs must search through the entire affected area to find victims, the path-planning operation becomes equivalent to an area coverage problem. In this study, we propose an innovative method for solving this problem based on a Partially Observable Markov Decision Process (POMDP), which accounts for the observations made by the UAVs. The formulation of the UAV path planning is based on the idea of assigning higher priorities to the areas that are more likely to contain victims. We applied the method to three illustrative cases covering different types of disasters: a tornado in Brazil, a refugee camp in South Sudan, and the nuclear accident in Fukushima, Japan. The results demonstrate that the POMDP solution achieves full coverage of the disaster-affected areas within a reasonable time span. Through a detailed multivariate sensitivity analysis, we evaluate the traveled distance and the operation duration (both of which were quite stable), as well as the time required to find groups of victims. Comparisons with a Greedy Algorithm show that the POMDP finds victims more quickly, which is the priority in humanitarian relief, whereas the Greedy Algorithm focuses on minimizing the traveled distance. We also discuss the ethical, legal, and social acceptance issues that can influence the application of the proposed methodology in practice.
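
To make the priority idea concrete, the following is a minimal Python sketch of belief-driven coverage on a grid. It is not the paper's actual formulation: the grid size, sensor detection probability, distance weight, and the myopic one-step scoring rule are all illustrative assumptions, and a full POMDP policy would plan over future observation sequences rather than a single step.

```python
import numpy as np

rng = np.random.default_rng(0)
GRID = (10, 10)
# Assumed independent prior probability that each cell contains victims,
# e.g., estimated from population density (hypothetical values here).
belief = rng.uniform(0.05, 0.6, GRID)
DETECT = 0.9  # assumed probability the sensor spots victims when present

def update(belief, cell, detected):
    """Per-cell Bayes update of the belief after observing one cell."""
    b = belief.copy()
    p = b[cell]
    if detected:
        b[cell] = 1.0  # victims confirmed in this cell
    else:
        # P(present | no detection) = p(1-d) / (p(1-d) + (1-p))
        miss = p * (1.0 - DETECT)
        b[cell] = miss / (miss + (1.0 - p))
    return b

def next_cell(belief, pos, visited, w_dist=0.02):
    """One-step lookahead: trade expected detections against travel cost.
    This myopic rule only illustrates 'visit likely cells first'; a full
    POMDP policy would optimize over sequences of observations."""
    best, best_score = None, -np.inf
    for cell in np.ndindex(*GRID):
        if cell in visited:
            continue
        dist = abs(cell[0] - pos[0]) + abs(cell[1] - pos[1])
        score = DETECT * belief[cell] - w_dist * dist
        if score > best_score:
            best, best_score = cell, score
    return best

pos, visited, path = (0, 0), set(), []
truth = rng.random(GRID) < belief          # simulated ground truth
while len(visited) < GRID[0] * GRID[1]:    # full coverage, as in the paper
    cell = next_cell(belief, pos, visited)
    detected = bool(truth[cell]) and rng.random() < DETECT
    belief = update(belief, cell, detected)
    visited.add(cell)
    path.append(cell)
    pos = cell

total = sum(abs(a[0] - b[0]) + abs(a[1] - b[1])
            for a, b in zip(path, path[1:]))
print(f"visited {len(path)} cells; Manhattan path length = {total}")
```

In this sketch, raising w_dist pushes the policy toward the distance-minimizing behavior of the Greedy baseline, while lowering it makes the planner jump to high-belief cells first, which mirrors the trade-off the abstract reports between finding victims quickly and minimizing traveled distance.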
