Abstract
Markov random fields (MRFs) are widely used models in machine vision applications. Mean-field inference is a popular approach to the MRF inference problem; however, it requires a large amount of computation, especially for dense MRFs. Although several parallel mean-field methods have been developed to reduce the computational complexity, none of them is globally convergent. In this paper, a mean-field inference method that is guaranteed to converge to a global optimum is developed. Experimental results show that the proposed method handles the inference problem of dense random fields effectively in image segmentation applications.
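For context, the sketch below illustrates the standard naive parallel mean-field update for a fully connected pairwise MRF with a Potts-style potential and Gaussian-kernel affinities. It is generic illustrative code under those assumptions, not the paper's proposed globally convergent algorithm; the function name, kernel choice, and parameters are hypothetical.

```python
# Minimal sketch: naive parallel mean-field updates on a dense pairwise MRF
# with a Potts-style potential. Illustrative only; this is the conventional
# parallel scheme (which in general lacks a global convergence guarantee),
# not the paper's proposed method.
import numpy as np

def mean_field_dense_mrf(unary, features, potts_weight=1.0, bandwidth=1.0,
                         num_iters=10):
    """Approximate marginals Q for a fully connected pairwise MRF.

    unary    : (N, L) array of unary energies (lower = more likely).
    features : (N, D) array defining pairwise affinities (e.g. pixel colour).
    Returns  : (N, L) array of approximate marginals.
    """
    # Dense pairwise affinities from an assumed Gaussian kernel.
    sq_dists = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    kernel = np.exp(-sq_dists / (2.0 * bandwidth ** 2))
    np.fill_diagonal(kernel, 0.0)  # no self-interaction

    # Initialise Q from the unary energies alone.
    q = np.exp(-unary)
    q /= q.sum(axis=1, keepdims=True)

    for _ in range(num_iters):
        # Potts message: penalise disagreement with neighbours' beliefs,
        # message_i(l) = w * sum_j k_ij * (1 - Q_j(l)).
        pairwise = potts_weight * (kernel @ (1.0 - q))
        # Parallel update of all marginals at once.
        q = np.exp(-unary - pairwise)
        q /= q.sum(axis=1, keepdims=True)
    return q

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    unary = rng.normal(size=(50, 3))   # 50 nodes, 3 labels
    feats = rng.normal(size=(50, 2))   # 2-D features per node
    marginals = mean_field_dense_mrf(unary, feats)
    print(marginals.argmax(axis=1))    # labelling from the approximate marginals
```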