Knowledge graph embedding (KGE) is typically used for link prediction, i.e., automatically predicting missing links in knowledge graphs. Current KGE models rely mainly on complicated mathematical associations that are highly expressive but overlook the uniformity behind the classical translational model TransE, which embeds all entities of a knowledge graph in a uniform space and thereby enables accurate embeddings. This study analyses the uniformity of TransE and proposes a novel KGE model, ConvUs, that combines uniformity with expressiveness. Built on the convolutional neural network (CNN), ConvUs imposes constraints on convolution filter values and employs a multi-layer, multi-scale CNN architecture with a non-parametric L2-norm-based scoring function to compute triple scores. This addresses potential uniformity-related issues in existing CNN-based KGE models, allowing ConvUs to maintain a uniform embedding space while benefiting from the expressive power of CNNs. Furthermore, circular convolution is applied to alleviate potential orderliness contradictions, making ConvUs better suited to uniform-space KGE. Our model outperforms the base model ConvKB and several baselines on the link prediction benchmarks WN18RR and FB15k-237, demonstrating strong applicability and generalization and indicating that a uniform embedding space with high expressiveness enables more efficient knowledge graph embeddings.
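To make the abstract's ingredients concrete, the following is a minimal, hypothetical sketch (not the paper's actual implementation) of how circular convolution can be combined with a non-parametric, TransE-style L2-norm score: entity and relation embeddings are each passed through a circular convolution with a shared filter, and a triple (h, r, t) is scored by the L2 norm of the translational residual. The filter values and their normalization below are illustrative placeholders standing in for the filter-value constraints the paper describes.

```python
import numpy as np

def circular_conv(x, kernel):
    """1-D circular convolution: indices wrap around the embedding, so no
    position in the vector is privileged (illustrative sketch)."""
    d, k = len(x), len(kernel)
    return np.array([sum(kernel[j] * x[(i + j) % d] for j in range(k))
                     for i in range(d)])

def score(h, r, t, kernel):
    """TransE-style non-parametric L2 score on circularly convolved
    embeddings; lower scores indicate more plausible triples."""
    return np.linalg.norm(circular_conv(h, kernel)
                          + circular_conv(r, kernel)
                          - circular_conv(t, kernel))

# Placeholder filter constraint (hypothetical): normalize absolute values to 1.
kernel = np.array([0.5, 0.3, 0.2])
kernel = kernel / np.abs(kernel).sum()
```

Because circular convolution is linear, a triple satisfying the translational relation h + r = t scores exactly zero under this sketch, which is one way a convolutional scoring function can preserve TransE-like uniformity.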