Abstract

For mobile robots to be able to work with and for people, and thus operate in our everyday environments, they need to be able to acquire knowledge through perception. In other words, they need to collect sensor measurements from which they extract meaningful information. This thesis covers some of the essential components of a robot perception system combining omnidirectional vision, odometry, and 3D laser range finders, from modeling to extrinsic calibration, from feature extraction to ego-motion estimation. We cover all these topics from the "point of view" of an omnidirectional camera. The contributions of this work are several and are listed here.

The thesis starts with an overview of the geometry of central omnidirectional cameras and also reviews previous calibration methods. This section makes three contributions. The first two are a new generalized model for describing both dioptric and catadioptric cameras and a calibration method that takes advantage of planar grids shown around the cameras, similar to the method in use for standard perspective cameras. The third contribution is the implementation of a toolbox for Matlab (called OCamCalib and freely available on-line), which implements the proposed calibration procedure.

The second part of the thesis is dedicated to the extraction and matching of vertical features from omnidirectional images. Vertical features are usually very predominant in indoor and outdoor structured environments and can therefore be very useful for robot navigation. The contribution of this part is a new method for matching vertical lines. The proposed method takes advantage of a descriptor that is very distinctive for each feature. Furthermore, this descriptor is invariant to rotation and slight changes of illumination.

The third part of the thesis is devoted to the extrinsic calibration of an omnidirectional camera with the odometry (i.e. wheel encoders) of a mobile robot.
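The generalized camera model proposed in the first part represents the projection of a central omnidirectional camera as a polynomial of the radial distance from the image center, the form adopted by the OCamCalib toolbox. A minimal back-projection sketch of such a model, with hypothetical polynomial coefficients and image center chosen purely for illustration, might look like:

```python
import numpy as np

def cam2world(u, v, poly_coeffs, center):
    """Back-project pixel (u, v) to a unit-norm 3D viewing ray.
    poly_coeffs: [a0, a1, a2, ...] of f(rho) = a0 + a1*rho + a2*rho^2 + ...
    center: (uc, vc), the image center (axis of mirror symmetry) in pixels.
    """
    x = u - center[0]
    y = v - center[1]
    rho = np.hypot(x, y)                    # radial distance in pixels
    z = np.polyval(poly_coeffs[::-1], rho)  # polyval wants highest degree first
    ray = np.array([x, y, z], dtype=float)
    return ray / np.linalg.norm(ray)

# Hypothetical coefficients for a catadioptric camera (illustration only)
coeffs = [-180.0, 0.0, 5.0e-4, 0.0, 0.0]
ray = cam2world(320.0, 260.0, coeffs, (400.0, 300.0))
```

Because a single polynomial covers both mirror-based and fisheye optics, the same calibration code can serve dioptric and catadioptric cameras alike.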
The contribution of this part is a new method for automatic self-calibration while the robot is moving. The method is based on an extended Kalman filter that combines the encoder readings with the bearing angle observations of one or more vertical features in the environment. Furthermore, an example of robot motion estimation is shown using the calibrated camera-odometry system.

The fourth part of the thesis is dedicated to the extrinsic calibration of an omnidirectional camera with a 3D laser range finder. The contribution of this method is that it uses no calibration object. Instead, calibration is performed using laser-camera correspondences of natural points that are manually selected by the user. The novelty of the method resides in a new technique for visualizing the usually ambiguous 3D information of range finders. We show that it is possible to transform the range information into a new image where natural features of the environment are highlighted. Therefore, finding laser-camera correspondences becomes as easy as image pairing.

The last part of the thesis is devoted to visual odometry for outdoor ground vehicles. We show a new method to recover the trajectory of a calibrated omnidirectional camera over several hundred meters by combining a feature-based with an appearance-based approach. All the contributions of this thesis are validated through experimental results using both simulated and real data.
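The camera-odometry calibration idea can be illustrated with a deliberately simplified EKF sketch: the state augments the robot pose with one unknown extrinsic parameter, here the camera's heading offset psi in the robot frame, and the filter fuses wheel-odometry predictions with bearing observations of a vertical feature. Unlike the thesis method, this toy version assumes a single landmark at a known position and considers only the rotational offset; all noise values and the simulated trajectory are invented for illustration.

```python
import numpy as np

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

class ExtrinsicEKF:
    # State: [x, y, theta, psi] -- robot pose plus the unknown
    # camera heading offset psi, estimated while the robot moves.
    def __init__(self, landmark):
        self.landmark = np.asarray(landmark, dtype=float)
        self.x = np.zeros(4)                       # psi initially unknown (0)
        self.P = np.diag([1e-6, 1e-6, 1e-6, 1.0])  # large uncertainty on psi
        self.Q = np.diag([1e-4, 1e-4, 1e-4, 0.0])  # psi is constant over time
        self.R = np.array([[1e-3]])                # bearing noise (assumed)

    def predict(self, v, w, dt):
        """Propagate the pose with unicycle odometry inputs (v, w)."""
        x, y, th, psi = self.x
        self.x = np.array([x + v * dt * np.cos(th),
                           y + v * dt * np.sin(th),
                           wrap(th + w * dt),
                           psi])
        F = np.eye(4)
        F[0, 2] = -v * dt * np.sin(th)
        F[1, 2] = v * dt * np.cos(th)
        self.P = F @ self.P @ F.T + self.Q

    def update(self, bearing):
        """Fuse one camera bearing observation of the vertical feature."""
        dx, dy = self.landmark - self.x[:2]
        q = dx * dx + dy * dy
        h = wrap(np.arctan2(dy, dx) - self.x[2] - self.x[3])  # predicted bearing
        H = np.array([[dy / q, -dx / q, -1.0, -1.0]])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + (K * wrap(bearing - h)).ravel()
        self.x[2], self.x[3] = wrap(self.x[2]), wrap(self.x[3])
        self.P = (np.eye(4) - K @ H) @ self.P

# Noise-free simulation: the robot drives an arc while the camera is
# mounted with a true heading offset of 0.3 rad.
psi_true = 0.3
landmark = np.array([10.0, 0.0])
ekf = ExtrinsicEKF(landmark)
pose = np.zeros(3)
v, w, dt = 1.0, 0.2, 0.1
for _ in range(200):
    pose[0] += v * dt * np.cos(pose[2])
    pose[1] += v * dt * np.sin(pose[2])
    pose[2] = wrap(pose[2] + w * dt)
    z = wrap(np.arctan2(landmark[1] - pose[1], landmark[0] - pose[0])
             - pose[2] - psi_true)
    ekf.predict(v, w, dt)
    ekf.update(z)
# ekf.x[3] converges toward the true camera offset psi_true
```

The key point the sketch shares with the thesis approach is that no dedicated calibration rig is needed: ordinary motion plus repeated bearing observations make the extrinsic parameter observable.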
