Abstract

Work on collaborative filtering (CF) has demonstrated the robust capabilities of shallow autoencoders on implicit feedback, showing performance highly competitive with other strong approaches (e.g., iALS and VAE-CF). However, despite its dual advantages of high performance and simple construction, EASE, a representative shallow autoencoder, still exhibits several major shortcomings that must be addressed. Specifically, the scalability of EASE is limited by the number of items, which determines the storage and inversion cost of a large dense matrix; its square-loss objective does not match the recommendation task's need to predict personalized rankings, yielding sub-optimal results; and its regularization coefficients are sensitive and must be re-calibrated for each dataset, making fine-tuning exhaustive and time-consuming. To address these obstacles, we propose a novel approach called Similarity-Structure Aware Shallow Autoencoder (AutoS²AE) that aims to enhance both recommendation accuracy and model efficiency. Our method introduces three similarity structures: Co-Occurrence, KNN, and NSW graphs, which replace the large dense matrix in EASE with a sparse structure, thus facilitating model compression. Additionally, we optimize the model by incorporating a low-rank training component into the matrix and applying a weighted square loss for improved ranking-oriented approximations. To automatically tune the hyperparameters, we further design two validation losses on the validation set for guidance and update the hyperparameters using the gradients of these validation losses. Both theoretical analyses regarding the introduced similarity structures and empirical evaluations on multiple real-world datasets demonstrate the effectiveness of our proposed method, which significantly outperforms competing baselines.
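To make the scalability claim concrete, the following minimal sketch computes the standard EASE closed-form solution with NumPy on a tiny hypothetical interaction matrix. The toy matrix `X` and the value of `lam` are illustrative assumptions, not data from the paper; the point is that the solution requires storing and inverting a dense item-by-item Gram matrix, which is the cost the abstract refers to.

```python
import numpy as np

# Hypothetical implicit-feedback matrix: 4 users x 3 items (1 = interaction)
X = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]], dtype=float)

lam = 0.5  # L2 regularization strength -- the dataset-sensitive hyperparameter

n_items = X.shape[1]
G = X.T @ X + lam * np.eye(n_items)  # dense n x n Gram matrix: O(n^2) storage
P = np.linalg.inv(G)                 # O(n^3) inversion -- the scalability bottleneck
B = P / (-np.diag(P))                # B_ij = -P_ij / P_jj (closed-form item weights)
np.fill_diagonal(B, 0.0)             # zero diagonal rules out trivial self-similarity

scores = X @ B  # predicted item scores per user; rank unseen items by these
```

Replacing the dense `B` with a sparse similarity structure, as the proposed method does, avoids materializing and inverting the full n-by-n matrix when the item count n is large.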
