Abstract

When mapping benthic habitats using remotely sensed data, the ability to discriminate between pairs of habitats is a key measure of the usefulness of a set of one or more input covariates. Where some input data are already available but a superior map is sought, map-makers would like to know which additional remote sensing data would most improve the quality of their maps. Depending on the purpose of the map, this improvement could be measured by the extent to which a selected pair of habitats is discriminated.

This study exploits an existing data-rich study site to provide guidance for the use of remote sensing technology in regions where such data do not yet exist. LiDAR (light detection and ranging) reflectivity, multibeam backscatter, WorldView-2 (WV2) bands 1–4, multibeam bathymetry, and depth-derived variables are analysed to determine the extent to which they enable benthic habitats of interest to be discriminated from one another in a statistical sense. Ground truth is provided in the form of towed video.

Quantitative results are tabulated for each of the six pairs of four key habitat classes: macroalgae, seagrass, sand, and reef. The technique of Canonical Variate Analysis (CVA) is used to calculate ratios of between-class to within-class variation, and cross-validated error-rate estimates are calculated for the best combination of N variables, where N varies from 1 to 8. It is found that: the Reef and Macroalgae classes cannot be statistically distinguished with the technologies and training methods studied here; WV2 augmented with depth provides good discrimination between the separable classes; and multibeam echosounder depth and backscatter data both provide good information for mapping cover types, but in general are not as useful as optical data where optical data are available.

LiDAR reflectivity is a very useful covariate, with discriminatory power comparable to any one of the first three WV2 bands and the added potential to penetrate to greater depths than passive satellite sensors.
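To illustrate the kind of between-class to within-class ratio that CVA computes, the following is a minimal NumPy sketch for the two-class case (the leading eigenvalue of the generalized scatter-matrix problem, equivalent to the Fisher criterion). The covariate values and class labels here are synthetic stand-ins, not data from the study.

```python
import numpy as np

def cva_separation(X, y):
    """Ratio of between-class to within-class variation along the
    leading canonical variate for labelled covariate data X, y."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))  # within-class scatter
    Sb = np.zeros_like(Sw)                   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        d = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (d @ d.T)
    # The leading eigenvalue of Sw^-1 Sb is the between/within ratio
    # maximised by the first canonical variate.
    eigvals = np.linalg.eigvals(np.linalg.solve(Sw, Sb))
    return float(np.max(eigvals.real))

# Synthetic stand-ins for two habitat classes described by two covariates
# (e.g. a reflectivity band and depth); values are illustrative only.
rng = np.random.default_rng(0)
sand = rng.normal([0.0, 0.0], 0.5, size=(50, 2))
seagrass = rng.normal([3.0, 3.0], 0.5, size=(50, 2))
X = np.vstack([sand, seagrass])
y = np.array([0] * 50 + [1] * 50)
ratio = cva_separation(X, y)  # large ratio -> well-separated pair
```

A large ratio indicates a habitat pair that the chosen covariates separate well; a ratio near zero corresponds to pairs, such as reef and macroalgae in this study, that the inputs cannot distinguish.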
