Abstract

Existing formulations of optical flow estimation and image segmentation have used Bayesian networks and Markov Random Field (MRF) priors to impose smoothness on the segmentation. These approaches typically focus on estimation in a single time slice based on two consecutive images. We develop a motion segmentation framework for a continuous stream of images using inference in a corresponding Dynamic Bayesian Network (DBN) formulation. It realises a spatio-temporal integration of optical flow and segmentation information through a transition prior that incorporates spatial and temporal coherence constraints on the flow field and on the evolution of the segmentation. The main contribution is the embedding of these assumptions into a DBN formulation and the derivation of a computationally efficient two-filter inference method based on factored belief propagation (BP) that allows for on-line and off-line parameter optimisation. The spatio-temporal coupling implemented in the transition priors ensures smooth flow field and segmentation estimates without resorting to MRFs. The algorithm is tested on synthetic and real image sequences.
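For intuition, the sketch below illustrates the two-filter smoothing idea that underlies this kind of DBN inference on a single discrete label chain with a "sticky" transition prior encouraging temporal coherence. It is a minimal, assumption-laden toy: independent per-pixel chains, no spatial coupling and no flow field, so it is not the paper's factored BP algorithm; all names and numbers are illustrative.

```python
import numpy as np

def two_filter_smoothing(lik, A, prior):
    """Forward and backward filters combined into smoothed marginals.

    lik   : (T, K) per-frame observation likelihoods for K labels
    A     : (K, K) transition prior, A[i, j] = p(label_t = j | label_{t-1} = i)
    prior : (K,)   initial label distribution
    Returns a (T, K) array of smoothed posterior marginals.
    """
    T, K = lik.shape
    alpha = np.zeros((T, K))
    beta = np.ones((T, K))

    # Forward filter: predict with the transition prior, then weight by evidence.
    alpha[0] = prior * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
        alpha[t] /= alpha[t].sum()

    # Backward filter: propagate future evidence back through the chain.
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (beta[t + 1] * lik[t + 1])
        beta[t] /= beta[t].sum()

    # Combine the two filters into smoothed per-frame marginals.
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Toy usage: two labels ("object" / "background"), a sticky transition prior,
# and noisy per-frame evidence with one ambiguous frame.
if __name__ == "__main__":
    A = np.array([[0.9, 0.1],
                  [0.1, 0.9]])          # temporal coherence: labels tend to persist
    prior = np.array([0.5, 0.5])
    lik = np.array([[0.8, 0.2],
                    [0.6, 0.4],
                    [0.3, 0.7],          # ambiguous observation
                    [0.7, 0.3],
                    [0.9, 0.1]])
    print(two_filter_smoothing(lik, A, prior))
```

With the sticky prior, the smoothed posterior for the ambiguous middle frame is pulled towards the label supported by its neighbours, which is the effect the transition prior in the paper is designed to achieve jointly over flow and segmentation.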
