Abstract
A new algorithm for data-adaptive, large-scale, computationally efficient estimation of bathymetry is proposed. The algorithm uses a first pass over the observations to construct a spatially varying estimate of data density, which is then used to predict the achievable estimate sample spacing for robust depth estimation across the area of interest. A low-resolution estimate of depth is also constructed during the first pass as a guide for further work. A piecewise-regular grid is then constructed following the sample-spacing estimates, and accurate depth is finally estimated over the composite refined grid using an extended and re-implemented version of the CUBE algorithm. Resource-efficient data structures allow the algorithm to operate over large areas and large datasets without excessive compute resources; a modular design allows more complex spatial representations to be included if required. The proposed system is demonstrated on a pair of hydrographic datasets, illustrating the adaptation of the algorithm to different depth- and sensor-driven data densities. Although the algorithm was designed for bathymetric estimation, it could readily be used on other two-dimensional scalar fields where variable data density is a driver.
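The two-pass structure described above can be illustrated with a minimal sketch: a first pass bins soundings into a coarse grid to estimate data density, and the density is then converted into an achievable per-region sample spacing for the refined grid. All names, grid sizes, and the `min_pts` threshold here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of the two-pass, density-adaptive gridding idea.
# This is NOT the paper's CHRT/CUBE code; names and thresholds are
# illustrative assumptions only.
import numpy as np

def first_pass_density(x, y, extent, coarse_cell):
    """First pass: count soundings per coarse cell as a density estimate."""
    x0, x1, y0, y1 = extent
    nx = int(np.ceil((x1 - x0) / coarse_cell))
    ny = int(np.ceil((y1 - y0) / coarse_cell))
    ix = np.clip(((x - x0) / coarse_cell).astype(int), 0, nx - 1)
    iy = np.clip(((y - y0) / coarse_cell).astype(int), 0, ny - 1)
    counts = np.zeros((ny, nx))
    np.add.at(counts, (iy, ix), 1)  # unbuffered accumulation per cell
    return counts

def achievable_spacing(counts, coarse_cell, min_pts=5):
    """Predict the finest robust estimate spacing per coarse cell by
    requiring at least `min_pts` soundings in each refined cell."""
    density = counts / coarse_cell**2            # soundings per unit area
    with np.errstate(divide="ignore"):
        spacing = np.sqrt(min_pts / density)     # cell side holding min_pts
    return np.minimum(spacing, coarse_cell)      # never coarser than pass 1

rng = np.random.default_rng(0)
# Dense "shallow" strip on the left, sparse "deep" area on the right.
x = np.concatenate([rng.uniform(0, 50, 20000), rng.uniform(50, 100, 1000)])
y = rng.uniform(0, 100, x.size)
counts = first_pass_density(x, y, (0, 100, 0, 100), coarse_cell=25.0)
spacing = achievable_spacing(counts, 25.0)
# Dense regions should support a finer estimate spacing than sparse ones.
assert spacing[0, 0] < spacing[0, -1]
```

In a full system, each coarse cell would then carry a regular sub-grid at its predicted spacing (the "piecewise-regular grid"), and a second pass would run the robust depth estimator over those refined nodes.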