Abstract

A new algorithm for data-adaptive, large-scale, computationally efficient estimation of bathymetry is proposed. The algorithm uses a first pass over the observations to construct a spatially varying estimate of data density, which is then used to predict the achievable estimate sample spacing for robust depth estimation across the area of interest. A low-resolution estimate of depth is also constructed during the first pass as a guide for further work. A piecewise-regular grid is then constructed following the sample spacing estimates, and accurate depth is finally estimated using the composite refined grid and an extended and re-implemented version of the CUBE algorithm. Resource-efficient data structures allow the algorithm to operate over large areas and large datasets without excessive compute resources; the modular design allows more complex spatial representations to be included if required. The proposed system is demonstrated on a pair of hydrographic datasets, illustrating the adaptation of the algorithm to different depth- and sensor-driven data densities. Although the algorithm was designed for bathymetric estimation, it could be readily used on other two-dimensional scalar fields where variable data density is a driver.
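
As a rough illustration of the two-pass structure described above, the Python sketch below computes a coarse density map from the soundings, converts it to a per-cell node spacing, and then estimates depth on a piecewise-regular refined grid. The function names, the min_samples spacing rule, the synthetic data, and the use of a median depth at each node are illustrative assumptions only; the actual algorithm estimates depth with an extended, re-implemented CUBE estimator rather than a median.

    import numpy as np

    def estimate_density(x, y, extent, coarse_cell):
        # Pass 1: count soundings per coarse cell to map spatially varying data density.
        x0, x1, y0, y1 = extent
        nx = int(np.ceil((x1 - x0) / coarse_cell))
        ny = int(np.ceil((y1 - y0) / coarse_cell))
        counts, _, _ = np.histogram2d(x, y, bins=[nx, ny], range=[[x0, x1], [y0, y1]])
        return counts / coarse_cell**2, (nx, ny)   # soundings per unit area

    def spacing_from_density(density, min_samples=5, finest=0.5, coarsest=32.0):
        # Predict node spacing so each refined node is supported by ~min_samples soundings
        # (an assumed rule of thumb, clipped to sensible finest/coarsest limits).
        spacing = np.sqrt(min_samples / np.maximum(density, 1e-12))
        return np.clip(spacing, finest, coarsest)

    def refine_and_estimate(x, y, z, extent, coarse_cell, spacing):
        # Pass 2: lay a piecewise-regular sub-grid in each coarse cell at the predicted
        # spacing and estimate depth at each node (median used here as a simple stand-in
        # for the extended CUBE estimator).
        x0, x1, y0, y1 = extent
        nx, ny = spacing.shape
        ci = np.clip(((x - x0) / coarse_cell).astype(int), 0, nx - 1)
        cj = np.clip(((y - y0) / coarse_cell).astype(int), 0, ny - 1)
        nodes = []
        for i in range(nx):
            for j in range(ny):
                sel = (ci == i) & (cj == j)
                if not sel.any():
                    continue
                s = spacing[i, j]
                cx, cy = x0 + i * coarse_cell, y0 + j * coarse_cell
                for px in np.arange(cx + s / 2, cx + coarse_cell, s):
                    for py in np.arange(cy + s / 2, cy + coarse_cell, s):
                        near = sel & (np.abs(x - px) <= s) & (np.abs(y - py) <= s)
                        if near.sum() >= 3:
                            nodes.append((px, py, np.median(z[near])))
        return np.array(nodes)

    # Example on synthetic soundings over a gently sloping seafloor.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 100, 20000)
    y = rng.uniform(0, 100, 20000)
    z = -20.0 - 0.1 * x + rng.normal(0.0, 0.2, x.size)
    extent = (0.0, 100.0, 0.0, 100.0)
    density, _ = estimate_density(x, y, extent, coarse_cell=10.0)
    spacing = spacing_from_density(density)
    nodes = refine_and_estimate(x, y, z, extent, 10.0, spacing)
    print(f"estimated depth at {len(nodes)} adaptively spaced nodes")

The nested per-cell loops keep the sketch readable; a production implementation would instead rely on the resource-efficient data structures described in the paper to handle large areas and datasets.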
