Abstract

To expedite the initial response and ensure the safety of citizens and facilities during a disaster, it is critical to quickly obtain information about the disaster site. In particular, identifying the locations of non-stationary objects, such as vehicles, is as important as locating rescue targets, because these objects may impede first responders. Because the locations of non-stationary objects cannot be predicted, the most feasible solution is to use mobile sensing imagery provided by citizens near the site. Several studies on object localization have used smartphone data obtained through participatory sensing; however, most required multiple images or additional infrastructure, which is unsuitable for sudden and urgent cases. Single-image approaches based on deep-learning depth estimation are limited because they estimate only relative locations and do not provide absolute geographic coordinates. To this end, we developed an algorithm that estimates the absolute locations of objects from a single smartphone image. Our solution combines photogrammetry, coordinate transformation, object detection, and depth estimation. First, an image and the corresponding device sensor data are acquired using a smartphone application. Second, target vectors pointing from the smartphone's perspective center to the target objects are constructed in a local coordinate system based on photogrammetric geometry, object detection, and depth estimation. Third, the target vectors are transformed from local to geographic coordinates. Finally, the locations of the target objects are calculated and displayed in a geographic coordinate system to support rapid decision-making. The final absolute locations were compared with ground reference points obtained from real-time kinematic (RTK) positioning using the global navigation satellite system (GNSS) and a total station (TS). In total, 70 target objects were identified in 18 images, and the estimated locations had an average absolute location error of 4.118 m.
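The third step of the pipeline, transforming a target vector from a local frame into geographic coordinates, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a local east-north-up (ENU) frame centered at the camera, WGS-84 ellipsoid parameters, and an iterative ECEF-to-geodetic conversion; all function names are hypothetical.

```python
import math

# WGS-84 ellipsoid constants (assumed datum)
A = 6378137.0                 # semi-major axis [m]
F = 1 / 298.257223563         # flattening
E2 = F * (2 - F)              # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic coordinates (degrees, meters) to ECEF (meters)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def ecef_to_geodetic(x, y, z):
    """Convert ECEF back to geodetic coordinates by fixed-point iteration."""
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - E2))  # initial guess
    h = 0.0
    for _ in range(5):  # converges to sub-mm for terrestrial points
        n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - E2 * n / (n + h)))
    return math.degrees(lat), math.degrees(lon), h

def enu_vector_to_geodetic(cam_lat, cam_lon, cam_h, e, n, u):
    """Rotate a local ENU target vector into ECEF, add it to the camera's
    ECEF position, and convert the result back to geodetic coordinates."""
    x0, y0, z0 = geodetic_to_ecef(cam_lat, cam_lon, cam_h)
    lat, lon = math.radians(cam_lat), math.radians(cam_lon)
    sl, cl = math.sin(lat), math.cos(lat)
    so, co = math.sin(lon), math.cos(lon)
    # Standard ENU-to-ECEF rotation at the camera position
    dx = -so * e - sl * co * n + cl * co * u
    dy =  co * e - sl * so * n + cl * so * u
    dz =            cl * n      + sl * u
    return ecef_to_geodetic(x0 + dx, y0 + dy, z0 + dz)

# Example: a target 10 m due north of a camera at (37.0 N, 127.0 E, 50 m)
lat, lon, h = enu_vector_to_geodetic(37.0, 127.0, 50.0, 0.0, 10.0, 0.0)
```

In this sketch, the ENU target vector stands in for the vector derived from photogrammetric geometry, object detection, and depth estimation; its length is the estimated camera-to-object distance and its direction comes from the device orientation sensors.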
