Abstract

Omnidirectional cameras are often mounted on robots to provide an immersive view of the surroundings. However, robots often move unsteadily and undergo rotation. In this work, we formulate a method to stabilize the viewpoint of omnidirectional videos by removing the rotation using dense optical flow fields. The method first projects each omnidirectional video frame onto a unit sphere, measures the optical flow at every point on the sphere, and estimates, frame by frame, the rotation that minimizes the rotational component of the flow. The omnidirectional video is then de-rotated to generate a ‘rotation-less, translation-only’ viewpoint. Such a technique is well suited to any environment, even one with sparse texture or repeating patterns where feature-correspondence-based methods may fail.
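To illustrate the core idea, here is a minimal sketch (not the authors' implementation) of one step of the pipeline: given flow vectors sampled at points on the unit sphere, a pure rotation with angular velocity w induces the flow w × p at each point p, so w can be recovered by linear least squares and the sphere de-rotated with Rodrigues' formula. The function names and the small-angle, per-frame treatment are assumptions for illustration only.

```python
import numpy as np


def estimate_rotation(points, flows):
    """Hypothetical helper: least-squares fit of an angular velocity w
    such that flows[i] ~ w x points[i] (purely rotational flow on the
    unit sphere). points, flows: (N, 3) arrays."""
    # w x p = -[p]_x w, so stack the -[p]_x blocks into one linear system.
    A = np.zeros((3 * len(points), 3))
    for i, (px, py, pz) in enumerate(points):
        A[3 * i:3 * i + 3] = [[0.0,  pz, -py],
                              [-pz, 0.0,  px],
                              [py,  -px, 0.0]]
    w, *_ = np.linalg.lstsq(A, flows.reshape(-1), rcond=None)
    return w


def derotate(points, w):
    """Rotate sphere points by the inverse of w (Rodrigues' formula),
    cancelling the estimated per-frame rotation."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return points.copy()
    k = -w / theta  # axis of the inverse rotation
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    return (points * cos_t
            + np.cross(k, points) * sin_t
            + np.outer(points @ k, k) * (1.0 - cos_t))
```

In practice each frame's estimated rotation would be composed over time and the full panorama resampled accordingly; the sketch above only shows the per-frame fit and inverse rotation.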
