Abstract

Sparse coding (SC), owing to its solid theoretical properties and outstanding effectiveness, has attracted increasing attention in a wide range of data representation and data mining applications. However, the optimization problems of most existing sparse coding algorithms are non-convex and thus prone to becoming stuck in bad local minima under the alternating optimization framework, especially in the presence of many outliers and noisy data. To enhance learning robustness, this study presents a unified framework named Self-Paced Sparse Coding (SPSC), which incorporates the self-paced learning methodology to gradually include data in the SC learning process, from easy samples to complex ones. SPSC implements soft instance selection rather than a heuristic hard sample-selection strategy. We also generalize the self-paced learning scheme to different levels of dynamic selection, on instances, features, and elements respectively. Furthermore, we present an optimization algorithm to solve the resulting problem and a theoretical analysis of its effectiveness. Extensive experimental results on real-world clean image datasets and on images with two kinds of corruption demonstrate the remarkable robustness of the proposed method for high-dimensional data representation in image clustering and reconstruction tasks, outperforming the state of the art.
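The soft instance selection described above can be illustrated with a minimal self-paced learning sketch. The weighting function and the toy weighted-mean estimation task below are illustrative assumptions, not the paper's actual SPSC objective; they only show the general mechanism of alternating between sample reweighting and model refitting while the pace parameter grows:

```python
import numpy as np

def soft_selfpaced_weights(losses, lam):
    """One common soft self-paced regularizer (an assumed form, not
    necessarily the paper's): easy samples (loss near 0) get weight
    near 1, hard samples (loss >= lam) get weight 0, linear in between."""
    return np.clip(1.0 - np.asarray(losses) / lam, 0.0, 1.0)

# Toy task: estimate a robust center of 1-D data containing outliers.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.1, 50),   # inliers near 0
                       rng.normal(5.0, 0.1, 5)])   # outliers near 5
center, lam = data.mean(), 0.5                     # init is pulled toward outliers
for _ in range(10):
    losses = (data - center) ** 2                  # per-sample loss
    v = soft_selfpaced_weights(losses, lam)        # soft instance selection
    center = np.average(data, weights=v)           # refit on "easy" samples
    lam *= 1.3                                     # pace: admit harder samples
```

Because the outliers incur large losses, their weights stay at zero while the pace parameter grows, so the fitted center converges to the inlier mean rather than the contaminated global mean.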
