Abstract

Spectral clustering is an important clustering method widely used for pattern recognition and image segmentation. Classical spectral clustering algorithms consist of two separate stages: 1) solving a relaxed continuous optimization problem to obtain a real matrix, followed by 2) applying K-means or spectral rotation to round the real matrix (i.e., the continuous clustering result) into a binary matrix called the cluster indicator matrix. Such a separate scheme is not guaranteed to achieve a jointly optimal result because useful information is lost between the two stages. To obtain a better clustering result, in this paper we propose a joint model that simultaneously computes the optimal real matrix and binary matrix. The existing joint model adopts an orthonormal real matrix to approximate the orthogonal but nonorthonormal cluster indicator matrix. Note that only in a very special case (i.e., when all clusters have the same number of samples) is the cluster indicator matrix an orthonormal matrix multiplied by a real number; otherwise, the error of approximating a nonorthonormal matrix with an orthonormal one is inevitably large. To overcome this drawback, we propose replacing the nonorthonormal cluster indicator matrix with a scaled cluster indicator matrix, which is orthonormal. Our method is capable of obtaining better performance because it is easier to minimize the difference between two orthonormal matrices. Experimental results on benchmark datasets demonstrate the effectiveness of the proposed method (called JSESR).
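The orthogonality argument above can be checked numerically. The sketch below (a hypothetical example, not from the paper) builds a binary cluster indicator matrix Y for two clusters of unequal size: its columns are orthogonal but Y^T Y is a diagonal matrix of cluster sizes rather than the identity, so Y is not orthonormal. Scaling each column by the inverse square root of its cluster size yields the scaled cluster indicator matrix F = Y (Y^T Y)^{-1/2}, which is orthonormal:

```python
import numpy as np

# Hypothetical example: 5 samples, 2 clusters of unequal size (3 and 2).
# Y[i, j] = 1 iff sample i belongs to cluster j.
Y = np.array([
    [1, 0],
    [1, 0],
    [1, 0],
    [0, 1],
    [0, 1],
], dtype=float)

# Columns of Y are orthogonal, but Y^T Y = diag(cluster sizes) != I,
# so Y is orthogonal yet nonorthonormal when cluster sizes differ.
gram = Y.T @ Y
print(gram)  # diag(3, 2)

# Scaled cluster indicator matrix F = Y (Y^T Y)^{-1/2}:
# each column is divided by the square root of its cluster size.
F = Y @ np.diag(1.0 / np.sqrt(np.diag(gram)))

# F is orthonormal: F^T F = I, so it can be directly compared with
# the orthonormal real matrix from the relaxed spectral problem.
print(np.allclose(F.T @ F, np.eye(2)))  # True
```

Only when all clusters have equal size is Y itself a scalar multiple of an orthonormal matrix, which is the special case the abstract points out.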
