Abstract
This paper proposes a novel technique for the automatic segmentation of dynamic objects, using only information from a single uncalibrated moving camera and without the need for manual labeling (or any human intervention, for that matter). Matching pairs of sparse features are extracted from consecutive frames, and the resulting optical flow information is divided into two classes (static and dynamic) using the RANSAC algorithm. This initial classification is used to incrementally train a Gaussian process (GP) classifier that can then segment dynamic objects in new images. The GP hyperparameters are optimized online during navigation, with new data gradually incorporated into the non-parametric model as it becomes available and redundant data discarded, maintaining a near-constant computational cost. The result is a vector containing the probability that each pixel in the image belongs to a dynamic object, along with the corresponding uncertainty estimate of this classification. Experiments conducted on different robotic platforms, ranging from modified cars (driving at speeds of up to 50 km/h) to portable cameras (with a full six-degree-of-freedom range of motion), show promising results even in highly unstructured environments with cars, buses, and pedestrians as dynamic objects. We also show how individual dynamic pixels can be clustered into different object instances, and those instances further clustered into semantically meaningful categories without any prior knowledge of the environment. Finally, we provide visual odometry results demonstrating the proposed algorithm's ability to correctly segment (and then remove) dynamic objects from a scene, and how this translates into more accurate motion estimates between frames.
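To make the first stage concrete, the sketch below shows one way the RANSAC-based static/dynamic split described above could be implemented with OpenCV: sparse corners are tracked between consecutive frames, a dominant (camera-induced) motion model is fitted with RANSAC, and the outliers are flagged as candidate dynamic points. The choice of a fundamental matrix as the motion model, along with all thresholds, is an illustrative assumption rather than the paper's exact formulation.

```python
import cv2
import numpy as np

def classify_flow(prev_gray, curr_gray, ransac_thresh=3.0):
    """Split sparse optical-flow matches into static/dynamic sets via RANSAC.

    Inliers of the dominant motion model are assumed to be static scene
    points; outliers are treated as candidate dynamic points.
    """
    # Sparse features in the previous frame (Shi-Tomasi corners).
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=1000,
                                   qualityLevel=0.01, minDistance=7)
    # Track them into the current frame (pyramidal Lucas-Kanade).
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None)
    good = status.ravel() == 1
    p0 = pts0[good].reshape(-1, 2)
    p1 = pts1[good].reshape(-1, 2)

    # RANSAC fit of the dominant motion (a fundamental matrix here;
    # a homography would be an alternative for near-planar scenes).
    _, inlier_mask = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, ransac_thresh)
    if inlier_mask is None:  # degenerate geometry: treat everything as static
        return p1, np.empty((0, 2))
    inliers = inlier_mask.ravel().astype(bool)

    return p1[inliers], p1[~inliers]  # (static points, dynamic points)
```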
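The RANSAC labels can then bootstrap a probabilistic classifier. The sketch below uses scikit-learn's batch GaussianProcessClassifier as a stand-in: the paper's incremental training, online hyperparameter optimization, and redundant-data pruning are not reproduced here, and the per-point feature vector (image position plus flow) is a hypothetical choice made for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

def make_features(points, flow):
    """Hypothetical per-point features: (x, y, flow_dx, flow_dy)."""
    return np.hstack([points, flow])

def train_gp(static_feats, dynamic_feats):
    """Fit a GP classifier on RANSAC-labelled samples (0 = static, 1 = dynamic)."""
    X = np.vstack([static_feats, dynamic_feats])
    y = np.hstack([np.zeros(len(static_feats)), np.ones(len(dynamic_feats))])
    # RBF kernel; hyperparameters are optimized via the marginal
    # likelihood inside fit() (batch, not online as in the paper).
    gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
    gp.fit(X, y)
    return gp

def dynamic_probability(gp, feats):
    """Probability that each sample belongs to a dynamic object.

    The distance of predict_proba from 0.5 serves as a simple
    per-sample uncertainty proxy.
    """
    p = gp.predict_proba(feats)[:, 1]
    uncertainty = 1.0 - 2.0 * np.abs(p - 0.5)
    return p, uncertainty
```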
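Finally, classified dynamic points can be grouped into object instances. A density-based method such as DBSCAN is one plausible mechanism for the instance-clustering step mentioned above; the paper's actual clustering procedure may differ.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_instances(dynamic_points, eps=15.0, min_samples=5):
    """Group dynamic points into object instances by spatial density.

    Returns one integer label per point; label -1 marks unclustered noise.
    eps and min_samples are illustrative values, not the paper's settings.
    """
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(dynamic_points)
```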