Abstract

Subspace segmentation, or clustering, remains a challenging problem in computer vision when high-dimensional data are contaminated by complex noise. Most current sparse-representation or minimum-rank based techniques are built on ℓ1-norm or ℓ2-norm losses, which are sensitive to outliers. Finite mixture models offer a powerful and flexible class of tools for modeling such complex noise. Among them, the exponential family mixture is especially useful in practice because of its universal approximation ability for any continuous distribution, and hence it covers a broad range of noise characteristics. Building on this modeling idea, this paper addresses the subspace clustering problem under complex noise using a finite mixture of exponential power (MoEP) distributions. We employ a penalized likelihood function to perform automatic model selection and thereby avoid over-fitting. Moreover, we introduce a novel prior on the singular values of the representation matrix, which yields a novel penalty in the resulting nonconvex, nonsmooth optimization problem. The parameters of the MoEP model are estimated with a maximum a posteriori (MAP) method, while the subspace is computed via joint weighted ℓp-norm and Schatten-q quasi-norm minimization. Both theoretical and experimental results demonstrate the effectiveness of the proposed method.
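To make the two central objects of the abstract concrete, the sketch below evaluates a mixture of exponential power (generalized Gaussian) densities and a Schatten-q quasi-norm penalty. This is a minimal illustration, not the paper's implementation: the EP parameterization here (scale parameter eta, shape parameter p, zero mean) is one common convention, and the function names `ep_pdf`, `moep_pdf`, and `schatten_q` are our own.

```python
import math
import numpy as np

def ep_pdf(x, eta, p):
    """Density of a zero-mean exponential power (EP) distribution:
    f(x; eta, p) = p * eta^(1/p) / (2 * Gamma(1/p)) * exp(-eta * |x|^p).
    p = 2 recovers a Gaussian, p = 1 a Laplacian."""
    coeff = p * eta ** (1.0 / p) / (2.0 * math.gamma(1.0 / p))
    return coeff * math.exp(-eta * abs(x) ** p)

def moep_pdf(x, weights, etas, ps):
    """Density of a finite mixture of EP components (MoEP):
    sum_k pi_k * f(x; eta_k, p_k), with mixing weights pi_k >= 0
    summing to 1.  Each component may have its own shape p_k, which
    is what lets the mixture fit heavy-tailed or mixed noise."""
    return sum(w * ep_pdf(x, e, p) for w, e, p in zip(weights, etas, ps))

def schatten_q(X, q):
    """Schatten-q quasi-norm penalty (0 < q <= 1), here returned as the
    q-th power: sum_i sigma_i(X)^q over the singular values of X.
    q = 1 gives the nuclear norm; q < 1 more closely approximates rank."""
    s = np.linalg.svd(np.asarray(X, dtype=float), compute_uv=False)
    return float(np.sum(s ** q))
```

For example, with `eta = 0.5` and `p = 2`, `ep_pdf` coincides with the standard normal density, and `schatten_q(np.eye(3), 0.5)` equals 3, since the identity has three unit singular values.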
