Abstract

Feature-based methods for ego-motion estimation are widely used in computer vision, but they must cope with errors in feature tracking. In this paper, we propose a robust real-time method for ego-motion estimation that assumes an affine motion of the background from the previous to the current frame. A new clustering technique is applied to subareas of the image to select, quickly and reliably, three features for computing the affine transform. The previous frame, warped according to the computed affine transform, is then compared with the current frame by a change-detection method in order to detect moving objects. Results are presented in the context of a vision-based surveillance system for monitoring outdoor environments.
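The abstract does not give implementation details, but the central algebraic step it describes, recovering a 2×3 affine transform from exactly three feature correspondences, is a small linear system (six equations in six unknowns). The following sketch is illustrative only; the function and variable names are assumptions, not the authors' code:

```python
import numpy as np

def affine_from_three(src, dst):
    """Solve for the 2x3 affine matrix A that maps three source points
    to three destination points (6 equations, 6 unknowns).

    src, dst: (3, 2) arrays of (x, y) feature positions in the
    previous and current frame, respectively (illustrative names).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Design matrix with homogeneous coordinates: rows [x, y, 1].
    M = np.hstack([src, np.ones((3, 1))])      # shape (3, 3)
    # Solve M @ X = dst; the affine parameters are A = X.T, shape (2, 3).
    # Fails (singular M) if the three features are collinear.
    A = np.linalg.solve(M, dst).T
    return A

def warp_points(A, pts):
    """Apply the affine transform A to an (N, 2) array of points."""
    pts = np.asarray(pts, dtype=float)
    return pts @ A[:, :2].T + A[:, 2]
```

In a full pipeline one would warp the entire previous frame with this transform (e.g. via a dense image-warping routine) before running change detection; note also that three collinear features make the system singular, which is one reason feature selection must be done carefully.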
