Abstract

To perform online inference efficiently, hashing techniques, which encode model parameters as binary codes, play a key role in reducing the computational cost of content-aware recommendation (CAR), particularly on devices with limited computational resources. However, current hashing methods for CAR fail to align their learning objectives (e.g., squared loss) with ranking-based metrics (e.g., Normalized Discounted Cumulative Gain (NDCG)), resulting in suboptimal recommendation accuracy. In this article, we propose a novel ranking-based CAR hashing method based on the Factorization Machine (FM), called Discrete Listwise FM (DLFM), for fast and accurate recommendation. Concretely, DLFM optimizes NDCG in the Hamming space to preserve listwise user-item relationships. We devise an efficient algorithm to solve the challenging DLFM problem, which directly learns binary parameters in a relaxed continuous solution space without additional quantization. In particular, our theoretical analysis shows that the optimal solution to the relaxed continuous optimization problem is approximately the same as that of the original discrete optimization problem. Through extensive experiments on two real-world datasets, we show that DLFM consistently outperforms state-of-the-art hashing-based recommendation techniques.
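To make the two ingredients of the abstract concrete, the following is a minimal illustrative sketch (not the authors' implementation): a Factorization Machine whose latent factors are binary codes in {-1, +1}^k, so pairwise interactions are computed from Hamming-space codes, together with the listwise NDCG metric that DLFM targets. All variable names, dimensions, and the toy data are assumptions for illustration only.

```python
import numpy as np

def fm_score_binary(x, w0, w, V_bin):
    """FM prediction with binary latent factors V_bin in {-1, +1}^{d x k}.

    Uses the standard O(d*k) FM identity for the pairwise term:
      sum_{i<j} <v_i, v_j> x_i x_j
        = 0.5 * sum_f [ (sum_i v_{i,f} x_i)^2 - sum_i v_{i,f}^2 x_i^2 ].
    """
    linear = w0 + w @ x
    s = V_bin.T @ x                                   # shape (k,)
    interaction = 0.5 * (np.sum(s ** 2) - np.sum((V_bin ** 2).T @ (x ** 2)))
    return linear + interaction

def ndcg_at_k(relevance, scores, k=10):
    """Listwise NDCG@k for one user's item list (the ranking metric DLFM optimizes)."""
    order = np.argsort(-scores)[:k]
    gains = (2.0 ** relevance[order] - 1) / np.log2(np.arange(2, len(order) + 2))
    ideal = np.sort(relevance)[::-1][:k]
    idcg = np.sum((2.0 ** ideal - 1) / np.log2(np.arange(2, len(ideal) + 2)))
    return gains.sum() / idcg if idcg > 0 else 0.0

# Toy usage: score a user's candidate items with binary-coded FM parameters
# and evaluate the induced ranking with NDCG@10 (hypothetical data).
rng = np.random.default_rng(0)
d, k, n_items = 16, 8, 20
w0, w = 0.0, rng.normal(size=d)
V_bin = np.sign(rng.normal(size=(d, k)))              # binary codes in {-1, +1}
items = rng.integers(0, 2, size=(n_items, d)).astype(float)
scores = np.array([fm_score_binary(x, w0, w, V_bin) for x in items])
relevance = rng.integers(0, 2, size=n_items).astype(float)
print("NDCG@10:", round(ndcg_at_k(relevance, scores, k=10), 4))
```

The sketch only shows scoring and evaluation; DLFM's contribution, per the abstract, is learning V_bin directly as binary codes by optimizing NDCG in a relaxed continuous space, rather than training real-valued factors and quantizing them afterwards.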
