Abstract
Differencing digital terrain models (DTMs) generated from multitemporal airborne light detection and ranging (lidar) data provides accurate and detailed information about three-dimensional (3D) changes on the Earth's surface. However, noticeable spurious errors along flight paths are often included in the differencing results, hindering accurate analysis of topographic changes. This paper proposes a new scalable method to alleviate these systematic errors with a high degree of automation, taking into account the practical limitations that arise when processing the rapidly increasing volume of large-scale lidar datasets. The proposed method focuses on estimating the displacements caused by vertical positioning errors, which are the most critical error source, and on adjusting DTMs already produced as basic lidar products, without access to the point cloud or raw data from the laser scanner. The feasibility and effectiveness of the proposed method were evaluated through experiments with county-level multitemporal airborne lidar datasets in Indiana, USA. The experimental results demonstrated that the proposed method could reasonably estimate the vertical displacement along the flight paths and improve the county-level lidar differencing results by reducing the problematic errors and increasing consistency across the flight paths. The improved differencing results presented in this paper are expected to provide more consistent information about topographic changes in Indiana. In addition, the proposed method can be a feasible solution to upcoming problems induced by rapidly increasing large-scale multitemporal lidar data, given recent active government-driven lidar data acquisition programs such as the U.S. Geological Survey (USGS) 3D Elevation Program (3DEP).
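The following is a minimal illustrative sketch, not the authors' implementation: it shows the general idea of DTM differencing and a per-flight-line vertical offset correction applied to an already-gridded DTM. The synthetic rasters, the strip-based flight-line layout, and the median-based offset estimate are all assumptions made for demonstration only.

```python
# Hypothetical sketch of DTM differencing with per-flight-line vertical adjustment.
# All data and variable names below are synthetic/assumed, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic co-registered DTMs (rows x cols, elevations in meters).
rows, cols = 200, 300
terrain = rng.normal(loc=250.0, scale=5.0, size=(rows, cols))
dtm_t1 = terrain
# Second-epoch DTM: same terrain plus a systematic vertical bias per flight line.
# Flight lines are approximated here as vertical strips of the raster.
flightline_id = np.arange(cols) // 100            # 3 strips -> ids 0, 1, 2
biases = np.array([0.00, 0.12, -0.08])            # meters, one bias per flight line
dtm_t2 = terrain + biases[flightline_id][np.newaxis, :]

# 1) Naive differencing: systematic stripes appear along the flight paths.
diff = dtm_t2 - dtm_t1

# 2) Estimate one vertical displacement per flight line from presumably stable
#    ground (here: all cells; in practice, change-free areas would be selected),
#    using the median for robustness against genuine topographic change.
est_offsets = np.array([
    np.median(diff[:, flightline_id == fl]) for fl in np.unique(flightline_id)
])

# 3) Adjust the second-epoch DTM with the estimated offsets and re-difference.
dtm_t2_adj = dtm_t2 - est_offsets[flightline_id][np.newaxis, :]
diff_adj = dtm_t2_adj - dtm_t1

print("estimated per-flight-line offsets (m):", np.round(est_offsets, 3))
print("RMS of difference before adjustment:", round(float(np.sqrt((diff ** 2).mean())), 3))
print("RMS of difference after adjustment: ", round(float(np.sqrt((diff_adj ** 2).mean())), 3))
```

In this toy setup the residual stripes vanish after adjustment because the simulated error is purely a constant vertical bias per flight line; a real workflow would additionally handle overlap regions, masking of changed areas, and varying flight-line geometry.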