Abstract

GNN-based recommendation models such as NGCF (Wang et al., 2019) and LightGCN (He et al., 2020) map each user/item to a unique embedding vector via an embedding table, then refine these representations by aggregating multi-hop neighbors on the user-item bipartite graph. In real-world systems with hundreds of thousands of users and items, the embedding layer consumes gigabytes of memory, which makes deployment difficult. Several methods exist to reduce the size of the embedding table, but most hashing-based models fail to capture graph-structure information or to preserve topological distances between users/items in the compressed embedding space. To this end, we present Position-aware Compositional Embedding (PCE), a low-memory alternative to the embedding layer. PCE constructs a unique embedding for each user/item by combining a fixed-size set of anchor nodes with the user/item's attached co-cluster on the graph. By incorporating both global and co-cluster positions into the compositional embeddings, PCE retains competitive representational capability under compression. Extensive experiments on three recommendation graphs demonstrate that PCE exceeds state-of-the-art compression techniques. In particular, compared with the full embedding table, PCE incurs only a ∼5% average relative loss in Recall@20 while using 16x fewer parameters. Moreover, the model can be compressed by a further 2x while achieving even better accuracy.
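To make the compositional idea concrete, the following is a minimal sketch of how an embedding layer built from shared anchor and co-cluster tables could look. All names, the additive composition rule, and the random assignments are illustrative assumptions for exposition, not the authors' implementation; in PCE the anchor and co-cluster assignments would come from the graph structure rather than random draws.

```python
import numpy as np

rng = np.random.default_rng(0)

class CompositionalEmbedding:
    """Hypothetical sketch: compose each entity's embedding from two small
    shared tables instead of storing one row per user/item."""

    def __init__(self, num_entities, num_anchors, num_clusters, dim):
        # Two small shared tables replace the full |V| x dim embedding table.
        self.anchor_emb = rng.standard_normal((num_anchors, dim))
        self.cluster_emb = rng.standard_normal((num_clusters, dim))
        # Precomputed assignments: each entity is tied to one anchor node
        # (global position) and one co-cluster (local position). Random here;
        # in practice these would be derived from the bipartite graph.
        self.anchor_of = rng.integers(0, num_anchors, num_entities)
        self.cluster_of = rng.integers(0, num_clusters, num_entities)

    def lookup(self, ids):
        # Compose the entity embedding from its anchor row and co-cluster row.
        return self.anchor_emb[self.anchor_of[ids]] + self.cluster_emb[self.cluster_of[ids]]

emb = CompositionalEmbedding(num_entities=200_000, num_anchors=1024, num_clusters=128, dim=64)
vecs = emb.lookup(np.array([0, 7, 42]))
print(vecs.shape)            # (3, 64)

full = 200_000 * 64          # parameters in a full embedding table
compact = (1024 + 128) * 64  # parameters in the two shared tables
print(full // compact)       # 173
```

The memory saving comes from the lookup tables scaling with the number of anchors and clusters rather than with the number of users and items; the per-entity cost shrinks to two integer assignment indices.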
