Abstract

An omnidirectional camera is a vision system providing a 360° panoramic view of the scene. Such an enhanced field of view can be achieved either by using catadioptric systems, which suitably combine mirrors and conventional cameras, or by employing purely dioptric fish-eye lenses. Omnidirectional cameras can be classified into two classes, central and non-central, depending on whether or not they satisfy the single effective viewpoint property (Baker & Nayar, 1998). As noted in (Svoboda & Pajdla, 1997), it is highly desirable that such imaging systems have a single effective viewpoint. When this property is satisfied, there exists a single center of projection, that is, every pixel in the sensed images measures the irradiance of the light passing through the same viewpoint in one particular direction. The reason a single viewpoint is so desirable is that it allows the user to generate geometrically correct perspective images from the pictures captured by an omnidirectional camera. Moreover, it allows applying the well-known theory of epipolar geometry, which in turn enables ego-motion estimation and structure from motion from image correspondences alone. As shown in (Baker & Nayar, 1998), central catadioptric systems can be built by combining an orthographic camera with a parabolic mirror, or a perspective camera with a hyperbolic or elliptical mirror. Conversely, panoramic cameras using fish-eye lenses cannot in general be considered central systems, but the single viewpoint property holds approximately true for some camera models (Micusik & Pajdla, 2003).

In this chapter, we focus on the calibration of central omnidirectional cameras, both dioptric and catadioptric. After outlining previous work on omnidirectional camera calibration, we describe our novel procedure and provide a practical Matlab Toolbox, which allows even inexpert users to easily calibrate their own cameras. Accurate calibration of a vision system is necessary for any computer vision task that requires extracting metric information about the environment from 2D images, such as ego-motion estimation and structure from motion. While a number of calibration methods have been developed for standard perspective cameras (Zhang, 2000), comparatively little work has been done on omnidirectional cameras.

The first part of this chapter will present a short overview of previous methods for the calibration of omnidirectional cameras; in particular, their limitations will be pointed out. The second part will present our calibration technique, whose performance is evaluated through calibration experiments. Then, we will present our Matlab Toolbox.
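To illustrate why the single effective viewpoint is so convenient, the following sketch renders a geometrically correct perspective view from a calibrated central omnidirectional image. It is only an illustration under stated assumptions, not the chapter's toolbox API: the function world2cam, which maps a viewing ray through the single viewpoint to the omnidirectional pixel observing it, is a hypothetical placeholder for whatever calibrated projection model is available.

```python
# Illustrative sketch (not the chapter's toolbox code): synthesizing a virtual
# pinhole view from a central omnidirectional image. world2cam(ray) -> (u, v)
# is an assumed calibrated mapping from a 3D viewing ray to omnidirectional
# pixel coordinates; because the camera is central, all rays share one viewpoint.
import numpy as np

def perspective_view(omni_img, world2cam, out_size=(480, 640), f=300.0, R=np.eye(3)):
    """Render a virtual perspective view of size out_size with focal length f,
    looking along the direction given by rotation R (same single viewpoint)."""
    h, w = out_size
    view = np.zeros((h, w) + omni_img.shape[2:], dtype=omni_img.dtype)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    for v in range(h):
        for u in range(w):
            # Ray through pixel (u, v) of the virtual pinhole camera.
            ray = R @ np.array([(u - cx) / f, (v - cy) / f, 1.0])
            ray /= np.linalg.norm(ray)
            # Look up which omnidirectional pixel sees this ray (assumed mapping).
            uo, vo = world2cam(ray)
            uo, vo = int(round(uo)), int(round(vo))
            if 0 <= vo < omni_img.shape[0] and 0 <= uo < omni_img.shape[1]:
                view[v, u] = omni_img[vo, uo]
    return view
```

Because every ray passes through the same center of projection, this resampling produces a true perspective image; for a non-central sensor no such exact remapping exists.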
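Likewise, once the camera is calibrated, pixel correspondences can be converted into unit bearing vectors and the standard epipolar constraint applies. The sketch below shows a plain linear (8-point style) estimate of the essential matrix from such ray correspondences; it is a generic illustration of how ego-motion can be recovered from correspondences alone, under assumed array shapes, not the specific method described in this chapter.

```python
# Illustrative sketch: linear estimation of the essential matrix E from calibrated
# ray correspondences, using the epipolar constraint r2_i^T E r1_i = 0.
import numpy as np

def essential_from_rays(r1, r2):
    """r1, r2: (N, 3) arrays of unit bearing vectors, N >= 8 correspondences.
    Returns E such that r2_i^T E r1_i is (approximately) zero for all i."""
    # Each correspondence gives one linear equation in the 9 entries of E.
    A = np.stack([np.outer(b, a).ravel() for a, b in zip(r1, r2)])  # N x 9
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Project onto the essential-matrix manifold: two equal singular values, one zero.
    U, S, Vt = np.linalg.svd(E)
    s = (S[0] + S[1]) / 2.0
    return U @ np.diag([s, s, 0.0]) @ Vt
```

The rotation and (up-to-scale) translation of the camera can then be factored out of E in the usual way, which is what makes ego-motion estimation from image correspondences possible once the single-viewpoint calibration is known.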
