Abstract

Because of their large field of view, omnidirectional images can assist robotic navigation algorithms with localization tasks. Since the images captured by omnidirectional sensors can be mapped to the sphere, estimating the attitude (3D rotation) of the camera can be treated as the problem of estimating the rotation between spherical images. This rotation estimation problem has usually been solved using point correspondences or gradient information at image points, with the computational cost that such point-based algorithms entail. We present an effective solution to the attitude estimation problem that uses line information from the images through the Radon transform, since line-based algorithms are less time consuming than point-based ones. In the formulation of the Radon transform we include a similarity function on the cross product of the two images which assigns a weight to every pair of lines, and this similarity function is integrated over all line pairs that satisfy a line constraint. That is, we formulate the problem of obtaining the Euler angles of the 3D camera rotation as a correlation of functions defined on the product of spheres S² × S², acted upon by elements of the direct product group of rotations SO(3) × SO(3). Because of the spherical treatment of the data, our approach uses the spherical Fourier transform and spherical harmonics to produce a solution in the Fourier domain.
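For orientation, the Fourier-domain correlation idea can be sketched in the simpler single-sphere case (an illustrative sketch only; the notation, conjugation conventions, and the restriction to a single sphere are assumptions, not the paper's exact formulation). The correlation of two spherical images f and g over candidate rotations is

C(R) = \int_{S^2} f(\omega)\, g(R^{-1}\omega)\, d\omega, \qquad R \in SO(3),

and expanding f and g in spherical harmonics with coefficients \hat{f}_l^m and \hat{g}_l^{m'}, Parseval's relation gives, up to normalization and conjugation conventions,

C(R) = \sum_{l \ge 0} \; \sum_{|m|,\,|m'| \le l} \hat{f}_l^m \, \overline{\hat{g}_l^{m'}} \, \overline{D^l_{m m'}(R)},

where D^l_{m m'}(R) are Wigner D-matrix entries evaluated at the ZYZ Euler angles of R. The rotation estimate is the maximizer of C(R), which can be searched efficiently via an inverse SO(3) Fourier transform. The formulation described in the abstract generalizes this to functions on S² × S² built from Radon-transformed line data and acted upon by SO(3) × SO(3).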
