Co-clustering algorithms seek homogeneous sub-matrices within a dyadic data matrix, such as a document-word matrix. Co-clustering can be expressed as a non-negative matrix tri-factorization problem X≈FSG⊤, subject to non-negativity constraints on all three factor matrices and orthogonality constraints on the row-coefficient matrix F and the column-coefficient matrix G. Most existing algorithms are based on Euclidean distance or Kullback–Leibler divergence and offer no parameters to control orthogonality. We propose to control orthogonality through parameters by adding two penalty terms to an α-divergence objective function. The resulting orthogonal parametric non-negative matrix tri-factorization uses separate orthogonality parameters for the row and column spaces. Finally, we compare the proposed algorithms with existing algorithms on six real text datasets.