Abstract

Various sensors, such as GPS receivers and digital compasses, can now be manufactured cost-effectively, which allows their deployment in conjunction with mobile video cameras. Hence, recorded clips can automatically be annotated with geospatial information, and the resulting georeferenced videos may be used in various Geographic Information System (GIS) applications. However, the research community lacks large-scale, realistic test datasets of such sensor-fused information with which to evaluate its techniques, since collecting real-world test data requires considerable time and effort. To fill this void, we propose an approach for generating synthetic video meta-data with realistic geospatial properties for mobile video management research. We highlight the essential aspects of georeferenced video meta-data and present an approach to simulate the behavioral patterns of mobile cameras in the synthetic data. The data generation process can be customized through user parameters for a variety of GIS applications that use mobile videos. We demonstrate the feasibility and applicability of the proposed approach through comparisons with real-world data.
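To make the kind of sensor-fused meta-data concrete, the following is a minimal sketch, not the paper's actual generator: it assumes each video sample carries a timestamp, a GPS position, and a compass heading, and it produces a smooth random-walk camera trajectory. The record fields and the parameter names (speed_mps, turn_sigma_deg) are illustrative stand-ins for the user parameters mentioned in the abstract.

```python
import math
import random
from dataclasses import dataclass


@dataclass
class FrameMeta:
    """One synthetic metadata sample for a mobile video frame (assumed schema)."""
    t: float        # seconds since the start of the clip
    lat: float      # GPS latitude (degrees)
    lon: float      # GPS longitude (degrees)
    heading: float  # compass viewing direction (degrees, 0 = north)


def simulate_clip(start_lat, start_lon, duration_s=60, hz=1.0,
                  speed_mps=1.4, turn_sigma_deg=10.0, seed=None):
    """Generate per-sample metadata along a smooth random-walk trajectory."""
    rng = random.Random(seed)
    lat, lon = start_lat, start_lon
    heading = rng.uniform(0.0, 360.0)
    meters_per_deg_lat = 111_320.0  # rough meters per degree of latitude
    samples = []
    for i in range(int(duration_s * hz)):
        # Small, correlated heading changes approximate a walking camera operator.
        heading = (heading + rng.gauss(0.0, turn_sigma_deg)) % 360.0
        step = speed_mps / hz
        lat += (step * math.cos(math.radians(heading))) / meters_per_deg_lat
        lon += (step * math.sin(math.radians(heading))) / (
            meters_per_deg_lat * math.cos(math.radians(lat)))
        samples.append(FrameMeta(t=i / hz, lat=lat, lon=lon, heading=heading))
    return samples


if __name__ == "__main__":
    # Example: a 5-second clip sampled at 1 Hz near an arbitrary start point.
    for m in simulate_clip(1.3521, 103.8198, duration_s=5, hz=1.0, seed=42):
        print(m)
```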
