Abstract
Image-based mobile mapping systems enable an efficient acquisition of georeferenced image sequences, which can be used for geo-data capture in subsequent steps. In order to provide accurate measurements in a given reference frame, e.g. when aiming at high-fidelity 3D urban models, high-quality georeferencing of the captured multi-view image sequences is required. Moreover, sub-pixel accurate orientations of these highly redundant image sequences are needed in order to optimally perform steps such as dense multi-image matching as a prerequisite for 3D point cloud and mesh generation. While direct georeferencing of image-based mobile mapping data performs well in open areas, poor GNSS coverage in urban canyons makes it difficult to meet these high accuracy requirements, even with high-grade inertial navigation equipment. Hence, we conducted comprehensive investigations aiming at assessing the quality of directly georeferenced sensor orientations as well as the improvement to be expected from image-based georeferencing in a challenging urban environment. Our study repeatedly delivered mean trajectory deviations of up to 80 cm. By performing image-based georeferencing using bundle adjustment for a limited set of cameras and a limited number of ground control points, mean check point residuals could be lowered from approx. 40 cm to 4 cm. Furthermore, we showed that largely automated image-based georeferencing is capable of detecting and compensating discontinuities in directly georeferenced trajectories.
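To make the evaluation metric behind these figures concrete, the following Python sketch computes mean 3D check point residuals and RMSE values by comparing estimated coordinates against independently surveyed reference coordinates. The function name and the demo numbers are illustrative assumptions, not the study's data.

```python
# Hypothetical evaluation sketch: compare check point coordinates obtained from
# direct georeferencing and from image-based georeferencing (bundle adjustment
# with GCPs) against independently surveyed reference coordinates.
import numpy as np

def check_point_stats(estimated_xyz: np.ndarray, reference_xyz: np.ndarray) -> dict:
    """Return mean 3D residual and RMSE values for a set of check points.

    estimated_xyz, reference_xyz: (n, 3) arrays of E/N/H coordinates in metres.
    """
    diffs = estimated_xyz - reference_xyz          # per-point coordinate differences
    dists = np.linalg.norm(diffs, axis=1)          # 3D residual per check point
    return {
        "mean_3d_residual_m": dists.mean(),
        "rmse_3d_m": np.sqrt((dists ** 2).mean()),
        "rmse_per_axis_m": np.sqrt((diffs ** 2).mean(axis=0)),
    }

# Illustrative numbers only (not the study's data): direct georeferencing at the
# decimetre level vs. bundle-adjusted results at the few-centimetre level.
reference = np.array([[0.00, 0.00, 0.00], [10.00, 5.00, 1.00]])
direct    = reference + np.array([[0.35, -0.20, 0.10], [0.30, 0.25, -0.15]])
adjusted  = reference + np.array([[0.03, -0.02, 0.01], [0.02, 0.03, -0.02]])

print(check_point_stats(direct, reference))    # mean 3D residual ~0.4 m
print(check_point_stats(adjusted, reference))  # mean 3D residual ~0.04 m
```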
Highlights
In recent years, image-based mobile mapping has evolved into a highly efficient and accurate mapping technology, as it enables capturing an enormous amount of metric image data in a short time period with no or only minimal road traffic interference.
Whereas most of the residuals for the ground control points (GCP) of stereo image sequence 2.1 are smaller than 2 cm, the highest value amounts to 19 cm, i.e. 1.9 pixels, which partly contributes to the largest 3D RMSE value of 47 mm.
Since we now set the tie point accuracy to 0.3 pixel and assumed 0.5 pixel for image observations to ground control points, the sequences acquired in July 2014 were reprocessed, which led to slightly different results compared to Cavegn et al. (2015) and Nebiker et al. (2015); a weighting sketch follows these highlights.
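As a rough illustration of how such a priori image observation accuracies enter a least-squares bundle adjustment, the sketch below derives observation weights from the stated standard deviations. The unit-weight sigma of 1 pixel and the function name are assumptions for illustration, not properties of the adjustment software actually used.

```python
# Hedged sketch of how a priori image observation accuracies translate into
# least-squares weights in a bundle adjustment (weight = (sigma_0 / sigma_i)^2).
# The 0.3 px / 0.5 px values follow the highlight above; sigma_0 = 1 px is assumed.

SIGMA_0_PX = 1.0          # a priori standard deviation of unit weight (assumed)
SIGMA_TIE_PX = 0.3        # tie point image observations
SIGMA_GCP_IMG_PX = 0.5    # image observations to ground control points

def observation_weight(sigma_obs_px: float, sigma_0_px: float = SIGMA_0_PX) -> float:
    """Weight of an uncorrelated observation in the normal equations."""
    return (sigma_0_px / sigma_obs_px) ** 2

print(observation_weight(SIGMA_TIE_PX))      # ~11.1 -> tie points weighted more strongly
print(observation_weight(SIGMA_GCP_IMG_PX))  # 4.0
```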
Summary
Image-based mobile mapping has evolved into a highly efficient and accurate mapping technology, as it enables capturing an enormous amount of metric image data in a short time period with no or only minimal road traffic interference. One of the main features of our mobile mapping system is the application of multiple cameras, which are used for dense 3D data capture applying the multi-view-stereo matching described in Cavegn et al. (2015). Such a configuration especially requires a high-quality relative orientation of the image sequences. An approach incorporating additional stereovision-based position updates into a loosely coupled GNSS Kalman filter was later exploited by Eugster et al. (2012). Whereas they demonstrated a consistent improvement of the absolute 3D measurement accuracy from several decimeters to a level of 5–10 cm for land-based mobile mapping, Ellum & El-Sheimy (2006) achieved no improvement in mapping accuracy. A systematic study aiming at assessing the quality of directly georeferenced sensor orientations in a challenging urban environment with frequent GNSS degradations is presented in section 4, and section 5 gives further results on the potential and quality of image-based georeferencing.
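As a rough illustration of the bridging idea summarised above, the following minimal sketch shows a loosely coupled position filter in which stereovision-derived positions serve as additional updates when GNSS is degraded. The class name, the simple constant-velocity model and all noise values are assumptions for illustration and do not reproduce the cited systems.

```python
# Minimal sketch (not the cited systems' implementation) of a loosely coupled
# position filter: GNSS positions and stereovision-derived positions both enter
# as position measurements, differing only in their assumed standard deviation.
# State: 1D position/velocity for brevity; all noise values are assumed.
import numpy as np

class LooselyCoupledFilter:
    def __init__(self, dt: float = 0.1):
        self.x = np.zeros(2)                        # [position, velocity]
        self.P = np.eye(2)                          # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
        self.Q = np.diag([0.01, 0.1])               # process noise (assumed)
        self.H = np.array([[1.0, 0.0]])             # both sensors observe position only

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_position(self, z: float, sigma: float):
        """Standard Kalman update with a position measurement of std. dev. sigma."""
        R = np.array([[sigma ** 2]])
        y = np.array([z]) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P

kf = LooselyCoupledFilter()
kf.predict()
kf.update_position(z=12.34, sigma=0.02)  # good GNSS fix, ~2 cm (assumed)
kf.predict()
kf.update_position(z=12.80, sigma=0.10)  # GNSS outage: stereovision-based position, ~10 cm (assumed)
print(kf.x)
```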