Abstract
Co-clustering algorithms seek homogeneous sub-matrices in a dyadic data matrix, such as a document–word matrix. Co-clustering can be expressed as a non-negative matrix tri-factorization problem X≈FSG⊤, subject to non-negativity constraints on all three matrices and orthogonality constraints on the row-coefficient matrix F and the column-coefficient matrix G. Most existing algorithms are based on Euclidean distance or Kullback–Leibler divergence and offer no parameters to control orthogonality. We propose to control orthogonality through parameters by adding two penalty terms to an α-divergence objective function. The resulting orthogonal parametric non-negative matrix tri-factorization controls the orthogonality of the row space and the column space separately. Finally, we compare the proposed algorithm with existing algorithms on six real text datasets.
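To make the model concrete, the following is a minimal illustrative sketch of penalized non-negative matrix tri-factorization X ≈ FSG⊤. It uses standard Euclidean-distance multiplicative updates with two penalty terms, λ_F‖F⊤F − I‖² and λ_G‖G⊤G − I‖², that encourage near-orthogonal row and column factors; it is a generic assumption-laden sketch, not the α-divergence algorithm proposed in the paper, and all function and parameter names are hypothetical.

```python
import numpy as np

def penalized_nmtf(X, k_row, k_col, lam_f=0.1, lam_g=0.1,
                   n_iter=200, eps=1e-9, seed=0):
    """Illustrative tri-factorization X ~ F S G^T (Euclidean objective).

    NOTE: a sketch only -- the paper's method uses an alpha-divergence
    objective; this uses the simpler Frobenius norm with orthogonality
    penalties on F and G to show the general structure.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k_row))   # row-coefficient matrix
    S = rng.random((k_row, k_col))  # block (core) matrix
    G = rng.random((n, k_col))   # column-coefficient matrix
    for _ in range(n_iter):
        # F-update: penalty gradient of ||F^T F - I||^2 splits into
        # a negative part (+lam_f*F) and a positive part (lam_f*F F^T F).
        num = X @ G @ S.T + lam_f * F
        den = F @ S @ (G.T @ G) @ S.T + lam_f * (F @ (F.T @ F)) + eps
        F *= num / den
        # G-update, symmetric to the F-update.
        num = X.T @ F @ S + lam_g * G
        den = G @ S.T @ (F.T @ F) @ S + lam_g * (G @ (G.T @ G)) + eps
        G *= num / den
        # S-update: no orthogonality penalty on the core matrix.
        num = F.T @ X @ G
        den = (F.T @ F) @ S @ (G.T @ G) + eps
        S *= num / den
    return F, S, G
```

The multiplicative form keeps all factors non-negative by construction: each update multiplies the current factor by a ratio of non-negative terms, so entries never change sign.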