Abstract

The objective of this research is to develop an approach to correct nonlinear errors in SRTM (Shuttle Radar Topography Mission) elevations, which cannot be handled by most traditional methods. First, a set of uncorrelated feature attributes was generated from the SRTM digital elevation model (DEM) together with the freely available Sentinel-2 multispectral imagery over a dense urban area in Egypt. Second, the SRTM DEM, the Sentinel-2 image, and the generated attributes were used as input data in an artificial neural network (ANN) classification model to assign each pixel to one of 12 reference elevation classes. Finally, the posterior probabilities obtained from the ANN were combined using an inverse probability weighted interpolation (IPWI) approach to estimate revised SRTM elevations. The results were compared with a reference DEM of 1-m vertical accuracy derived through image matching of WorldView-1 stereo satellite imagery. Performance was evaluated using scatter plots, the correlation coefficient (R), standard deviation (SD), and root mean square error (RMSE). The results show that, using the SRTM DEM as a single data source, the RMSE of the estimated elevations improved to 3.04 m; adding the Sentinel-2 image improved the RMSE to 2.93 m, and including the generated attributes as well reduced it further to 2.07 m. Compared with the commonly used multiple linear regression (MLR) method, the improvement in RMSE of the estimated elevations can reach 45%.

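To illustrate how ANN class posteriors might be combined into an elevation estimate and then scored, the Python/NumPy sketch below computes a weighted elevation from the 12 class posteriors and their associated reference elevations, and reports R, SD, and RMSE against a reference DEM. The array names, the eps term, and the inverse-complement weighting (1 / (1 - p)) are assumptions made for illustration only; the abstract does not give the exact IPWI formulation, which may differ in the paper.

import numpy as np

# Hypothetical inputs (names assumed for illustration):
#   posteriors -- (n_pixels, 12) ANN posterior probabilities per class
#   ref_levels -- (12,) reference elevations associated with the 12 classes
#   ref_dem    -- (n_pixels,) reference elevations (e.g., from the WorldView-1 DEM)

def ipwi_elevation(posteriors, ref_levels, eps=1e-6):
    """Combine class posteriors into a per-pixel elevation estimate.

    Illustrative weighting only: each reference elevation is weighted by the
    inverse of its complementary posterior (1 - p), so classes with higher
    posterior probability receive larger weights.
    """
    weights = 1.0 / (1.0 - posteriors + eps)          # (n_pixels, 12)
    return (weights * ref_levels).sum(axis=1) / weights.sum(axis=1)

def evaluate(estimated, reference):
    """Return (R, SD, RMSE) of estimated elevations against the reference DEM."""
    residuals = estimated - reference
    r = np.corrcoef(estimated, reference)[0, 1]
    sd = residuals.std()
    rmse = np.sqrt(np.mean(residuals ** 2))
    return r, sd, rmse

# Example with placeholder data:
# rng = np.random.default_rng(0)
# p = rng.dirichlet(np.ones(12), size=1000)
# z_hat = ipwi_elevation(p, np.linspace(0.0, 60.0, 12))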