Abstract

The disposal of unexploded ordnance (UXO) at sea is a global problem. The mapping and remediation of historic UXOs can be assisted by autonomous underwater vehicles (AUVs) carrying sensor payloads such as synthetic aperture sonar (SAS) and optical cameras. AUVs can image large areas of the seafloor in high resolution, motivating an automated approach to UXO detection. Modern methods commonly use supervised machine learning, which requires labelled examples from which to learn. This work investigates the often-overlooked labelling process and resulting dataset using an example historic UXO dumpsite at Skagerrak. A counterintuitive finding of this work is that optical images cannot be relied on for ground truth, as a significant number of UXOs visible in SAS images are not visible in optical images and are presumed to be buried. Given the lack of ground truth, we use an ordinal labelling scheme to incorporate a measure of labeller uncertainty. We validate this labelling regime by quantifying label accuracy against high-confidence optical labels. Using this approach, we explore different taxonomies and conclude that grouping objects into the classes shell, bomb, debris, and natural gave the best trade-off between accuracy and discrimination.
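
To illustrate the idea of an ordinal labelling scheme, the sketch below shows one way ordinal SAS labels might be thresholded and compared against high-confidence optical labels. The specific confidence levels, function names, and data are illustrative assumptions, not the paper's actual scheme or results.

```python
# Hypothetical sketch of an ordinal labelling scheme and a simple
# agreement check against high-confidence optical labels.
# All levels, names, and data below are illustrative assumptions.

# Ordinal confidence levels a SAS labeller might assign to a contact.
ORDINAL_LEVELS = {
    0: "definitely not UXO",
    1: "probably not UXO",
    2: "possibly UXO",
    3: "probably UXO",
    4: "definitely UXO",
}

def binarise(ordinal_label: int, threshold: int = 3) -> bool:
    """Collapse an ordinal SAS label to a binary UXO / not-UXO decision."""
    return ordinal_label >= threshold

def label_accuracy(sas_labels, optical_labels, threshold: int = 3) -> float:
    """Fraction of contacts where the thresholded SAS label agrees with a
    high-confidence optical label (True = UXO present)."""
    pairs = list(zip(sas_labels, optical_labels))
    agree = sum(binarise(s, threshold) == o for s, o in pairs)
    return agree / len(pairs)

# Toy example: five contacts with ordinal SAS labels and optical labels.
sas = [4, 3, 2, 0, 1]
optical = [True, True, False, False, True]
print(f"Agreement at threshold 3: {label_accuracy(sas, optical):.2f}")
```

Varying the threshold in this sketch corresponds to trading off how much labeller uncertainty is tolerated before a contact is counted as a UXO.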
