Abstract

Cameras form an essential part of any autonomous surface vehicle's sensor package, both for detecting light signals as required for COLREGs compliance and for identifying and tracking other vessels. Because cameras have limited fields of view compared to more traditional autonomy sensors such as lidars and radars, an autonomous surface vessel is typically equipped with multiple cameras, which can induce biases in tracking when a target appears in multiple image frames. In this work, we propose a novel pipeline for camera-based maritime tracking that combines georeferencing with clustering-based multi-camera fusion to produce bias-free camera measurements with target range estimates. Using real-world datasets collected with the milliAmpere research platform, the pipeline outperformed a lidar benchmark across multiple performance measures, both in pure detection performance and as part of a JIPDA-based tracking system.
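The abstract names two camera-specific steps, georeferencing and clustering-based multi-camera fusion, that a short sketch can make concrete. The snippet below is a minimal illustration, not the paper's implementation: it assumes a flat sea surface at z = 0, known camera intrinsics and extrinsics, and a simple distance threshold for merging detections from overlapping cameras; the function names, the OpenCV-style camera convention, and the 5 m merge distance are all illustrative choices.

```python
import numpy as np


def georeference(pixel_uv, K, R_wc, t_wc):
    """Intersect the camera ray through a pixel with the sea-surface plane z = 0.

    K    : 3x3 intrinsic matrix
    R_wc : 3x3 rotation from the camera frame to the world frame
    t_wc : camera position in the world frame (z up, sea surface at z = 0)
    Returns the world (x, y) position and the range from the camera,
    or None if the ray never reaches the water in front of the camera.
    """
    uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    ray_world = R_wc @ (np.linalg.inv(K) @ uv1)   # back-projected ray direction
    if ray_world[2] >= -1e-9:
        return None                               # ray does not point down toward the sea
    s = -t_wc[2] / ray_world[2]                   # scale at which the ray reaches z = 0
    p = t_wc + s * ray_world
    return p[:2], float(np.linalg.norm(p - t_wc))


def fuse_detections(points, merge_dist=5.0):
    """Greedy distance-based clustering of georeferenced detections: positions
    of the same target seen by several cameras are averaged into one measurement."""
    points = [np.asarray(p, dtype=float) for p in points]
    fused, used = [], [False] * len(points)
    for i, p in enumerate(points):
        if used[i]:
            continue
        cluster, used[i] = [p], True
        for j in range(i + 1, len(points)):
            if not used[j] and np.linalg.norm(points[j] - p) < merge_dist:
                cluster.append(points[j])
                used[j] = True
        fused.append(np.mean(cluster, axis=0))
    return fused


if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    # Camera 5 m above the water, looking horizontally along world +x
    # (OpenCV convention: camera z forward, x right, y down).
    R_wc = np.array([[0.0, 0.0, 1.0],
                     [-1.0, 0.0, 0.0],
                     [0.0, -1.0, 0.0]])
    t_wc = np.array([0.0, 0.0, 5.0])
    xy, rng = georeference((640.0, 400.0), K, R_wc, t_wc)
    print(f"georeferenced detection at {xy}, range {rng:.1f} m")
    # Two cameras that both see the same target yield one fused measurement.
    print(fuse_detections([xy, xy + np.array([1.0, -0.5]), np.array([40.0, 30.0])]))
```

Merging near-duplicate georeferenced detections before they enter the tracker is what removes the multi-camera bias the abstract refers to; the fused positions (with their implied ranges) can then be fed to a tracking filter such as the JIPDA mentioned above.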
