Abstract

Kernel subspace clustering has attracted considerable attention in recent years owing to its excellent performance in grouping data points drawn from multiple manifolds. In particular, multiple kernel learning (MKL) is commonly used in kernel subspace clustering models to alleviate the difficulty of choosing a suitable kernel function, which has given rise to numerous MKL-based subspace clustering methods. Despite these successes, such MKL methods cannot guarantee that the feature representation produced by the learned kernel is easily separable, which is critical for the clustering task. Moreover, the weighting strategies in these methods ignore the fitting ability of each base kernel, which may lead to inappropriate weight assignment and inferior performance. To address these problems, this paper proposes a Self-Paced Smooth Multiple Kernel Subspace Clustering (Spaks) method. In Spaks, we first incorporate a feature smoothing regularization into the kernel subspace clustering model to promote the smoothness of the feature representation on the affinity graph, thereby ensuring that data points from different subspaces are sufficiently separable. We then implement the multiple kernel mechanism within a self-paced learning framework, so that the weight of each base kernel depends on its single-kernel objective function value: a kernel with a smaller objective receives a greater weight. Finally, we optimize the formulated model with an alternating iteration scheme to obtain the optimal affinity matrix for clustering. Extensive experiments on seven benchmark data sets validate the superiority of the proposed Spaks method on the clustering task.
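The self-paced weighting idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exact Spaks objective and self-paced regularizer are not given in the abstract, so the self-expressiveness objective and the exponential weighting rule below are assumptions chosen to match the stated behavior (a base kernel with a smaller single-kernel objective receives a larger weight).

```python
import numpy as np

def kernel_self_expression_objective(K, C, lam_reg=0.1):
    """Assumed single-kernel self-expressiveness objective:
    ||Phi - Phi C||_F^2 + lam_reg * ||C||_F^2 computed in kernel form,
    i.e. tr(K - 2 K C + C^T K C) + lam_reg * ||C||_F^2."""
    fit = np.trace(K - 2.0 * (K @ C) + C.T @ K @ C)
    return fit + lam_reg * np.sum(C ** 2)

def self_paced_weights(objectives, lam=1.0):
    """Hypothetical soft self-paced rule: w_m proportional to
    exp(-obj_m / lam), so a smaller objective yields a larger weight."""
    w = np.exp(-np.asarray(objectives, dtype=float) / lam)
    return w / w.sum()

def combine_kernels(kernels, weights):
    """Consensus kernel as the weighted sum of the base kernels."""
    return sum(w * K for w, K in zip(weights, kernels))
```

In an alternating scheme of the kind the abstract describes, one would repeat: solve for the representation `C` under the current consensus kernel, evaluate each base kernel's objective, update the weights, and recombine the kernels until convergence.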
