Abstract

This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction at the interfaces between the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach.
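The paper's FOV simulator is not reproduced here, but the core operation any such ray tracer needs is the refraction of an optical ray at each media interface (air–glass, glass–water) via Snell's law. The sketch below shows that single step in vector form; the direction, normal, and refractive indices in the usage example are illustrative, not values from the paper.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a planar interface with unit normal n
    (pointing toward the incoming ray), passing from a medium of refractive
    index n1 into one of index n2. Returns None on total internal reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)              # cosine of the incidence angle
    eta = n1 / n2                      # relative refractive index
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        return None                    # total internal reflection
    t = eta * d + (eta * cos_i - np.sqrt(k)) * n
    return t / np.linalg.norm(t)

# Illustrative: a ray hitting a flat air-water interface at 45 degrees
# bends toward the normal (to roughly 32 degrees for n_water = 1.333).
d_in = np.array([np.sin(np.radians(45.0)), 0.0, -np.cos(np.radians(45.0))])
d_out = refract(d_in, np.array([0.0, 0.0, 1.0]), 1.0, 1.333)
```

Tracing a ray through a housing amounts to applying this step once per interface, with the interface normals given by the housing geometry being tested.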

Highlights

  • In the last few years, omnidirectional cameras have received increasing interest from the computer vision community in tasks such as augmented reality, visual surveillance, motion estimation and simultaneous localization and mapping (SLAM)

  • After a successful calibration of the omnidirectional camera, each pixel of any image can be associated with a 3D ray in space

  • Except for the small area where there is image overlap, it is not possible to estimate the distance to the objects from just a set of images acquired at a single location
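The pixel-to-ray association in the highlight above can be sketched for the simplest case, a calibrated pinhole camera in air: each pixel back-projects through the intrinsic matrix to a direction in the camera frame. The intrinsic values below are illustrative, not the Ladybug 3's, and the paper's underwater model would additionally refract this ray through the housing.

```python
import numpy as np

def pixel_to_ray(u, v, K):
    """Back-project pixel (u, v) to a unit 3D ray in the camera frame,
    assuming a calibrated pinhole model with intrinsic matrix K."""
    p = np.array([u, v, 1.0])          # pixel in homogeneous coordinates
    ray = np.linalg.inv(K) @ p         # direction in camera coordinates
    return ray / np.linalg.norm(ray)

# Illustrative intrinsics: focal lengths fx = fy = 800 px,
# principal point at (640, 480).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 480.0],
              [  0.0,   0.0,   1.0]])

# The ray through the principal point points along the optical axis.
r = pixel_to_ray(640, 480, K)
```

Without image overlap, only this ray direction is known per pixel; the depth along the ray remains unobservable from a single viewpoint, which is the point the last highlight makes.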

Summary

Introduction

In the last few years, omnidirectional cameras have received increasing interest from the computer vision community in tasks such as augmented reality, visual surveillance, motion estimation and simultaneous localization and mapping (SLAM). The calibration procedure is done in three stages rather than in a single combined step. This allows a smaller number of parameters to be estimated in each individual step, avoiding unwanted correlations among the parameter estimates, and it allows the use of image sets captured in air for the estimation of the parameters that are not related to the underwater housing. This way, the intrinsic and extrinsic parameters are not affected by disturbances such as the non-modeled geometric inaccuracies of the waterproof housing. The last stage takes place underwater and estimates the camera pose with respect to the waterproof housing. By contrast, when the OMS is used for object recognition or tracking, the effort will concentrate on achieving the best model possible for the image formation geometry.

Related Work
Contributions
Camera Design
Camera Model
Calibration
Initialization
Refinement
Extrinsic Calibration
Underwater Calibration
Ray Tracing
Housing Parameters Optimization
Results
Panorama Composition
Single Camera Calibration
Underwater Housing Optimization
Conclusions

