Abstract

To resist the adverse effects of shadow interference, illumination changes, poor texture and scene jitter in object detection, and to improve detection performance, a background modelling method based on a local fusion feature and variational Bayesian learning is proposed. First, the U-LBSP (uniform local binary similarity patterns) texture feature, Lab colour and location features are combined to construct the local fusion feature. U-LBSP is modified from local binary patterns to reduce computational complexity and better resist the influence of shadows and illumination changes. The joint colour and location feature is introduced to handle regions with poor texture and scenes affected by jitter. Then, the LFGMM (Gaussian mixture model based on the local fusion feature) is learned and updated by variational Bayes. To adapt to dynamically changing scenes, the variational expectation maximisation algorithm is applied to optimise the distribution parameters. In this way, the optimal number of Gaussian components as well as their parameters can be estimated automatically at a lower time cost. Experimental results show that the authors’ method achieves outstanding detection performance, especially under shadow disturbance, illumination changes, poor texture and scene jitter, with strong robustness and high accuracy.
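To make the pipeline described above more concrete, the following Python sketch illustrates the general idea only: a binary-similarity texture descriptor in the spirit of LBSP, fused with Lab colour and pixel location, and a Gaussian mixture whose component count is inferred by variational Bayes. The similarity threshold, the 8-neighbour layout and the use of scikit-learn's BayesianGaussianMixture as a stand-in for the LFGMM variational update are assumptions made for the example; this is not the authors' implementation.

```python
# Illustrative sketch only (not the authors' implementation).
import numpy as np
import cv2
from sklearn.mixture import BayesianGaussianMixture


def lbsp_descriptor(gray, x, y, thresh=0.1):
    """Binary similarity pattern: compare the centre pixel with its 8
    neighbours; a bit is 1 when the absolute difference stays within a
    relative threshold (similarity), unlike LBP, which encodes ordering.
    Assumes (x, y) is not on the image border."""
    c = float(gray[y, x])
    bits = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            n = float(gray[y + dy, x + dx])
            bits.append(1 if abs(n - c) <= thresh * max(c, 1.0) else 0)
    return np.array(bits, dtype=np.float64)


def fused_feature(bgr, x, y):
    """Concatenate the binary texture pattern, Lab colour and normalised
    (x, y) location into one local fusion feature for pixel (x, y)."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    tex = lbsp_descriptor(gray, x, y)
    col = lab[y, x].astype(np.float64) / 255.0
    loc = np.array([x / bgr.shape[1], y / bgr.shape[0]])
    return np.concatenate([tex, col, loc])


def fit_background_model(feature_samples, max_components=5):
    """Fit a variational-Bayes GMM to features collected over several frames.
    The Dirichlet-process prior prunes unneeded components, so the effective
    number of Gaussians is estimated automatically rather than fixed."""
    model = BayesianGaussianMixture(
        n_components=max_components,
        weight_concentration_prior_type="dirichlet_process",
        covariance_type="diag",
        max_iter=200,
    )
    model.fit(feature_samples)  # variational EM under the hood
    return model


def is_foreground(model, feature, log_density_threshold=-40.0):
    """A pixel whose fused feature is poorly explained by the learned
    mixture (low log-density) is flagged as foreground."""
    return model.score_samples(feature[None, :])[0] < log_density_threshold
```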
