Abstract
The quality of embeddings is crucial for downstream tasks on knowledge graphs. Researchers therefore apply neural architecture search to knowledge graph embedding so that an appropriate neural network can be constructed automatically for each dataset. One existing approach divides the search space into a macro search space and a micro search space, and searches the micro space with a one-shot weight-sharing strategy. However, that strategy discards all the information obtained from the preceding supernet training, so the advantages of the one-shot algorithm are not fully exploited. In this paper, we conduct experiments on common datasets for two important downstream tasks of knowledge graph embedding (entity alignment and link prediction) and compare the search performance against existing manually designed neural networks as well as strong neural architecture search algorithms. The results show that, given the same search time on the same dataset, the improved algorithm finds better architectures, and it needs less time to find architectures of comparable performance. Moreover, the models it finds outperform the best manually designed ones.
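To make the one-shot weight-sharing idea concrete, the following is a minimal illustrative sketch (hypothetical names and a toy "training" update, not the paper's implementation): a supernet keeps one shared weight per candidate operation, sampled sub-architectures reuse those weights during search, and evaluation reads them directly instead of retraining from scratch, so the supernet training is not discarded.

```python
# Toy sketch of one-shot weight sharing for a micro search space.
# All names (CANDIDATES, supernet, train_supernet, evaluate) are
# illustrative assumptions, not the paper's actual code.

import random

# Candidate operations for each of 3 layers in the micro search space.
CANDIDATES = ["conv", "gate", "identity"]

# Supernet: a shared weight table with one entry per (layer, op) pair.
supernet = {(layer, op): 0.0 for layer in range(3) for op in CANDIDATES}

def train_supernet(steps):
    """Train the shared weights by sampling a random sub-architecture
    each step (uniform path sampling)."""
    for _ in range(steps):
        arch = [random.choice(CANDIDATES) for _ in range(3)]
        for layer, op in enumerate(arch):
            supernet[(layer, op)] += 1.0  # stand-in for a gradient update

def evaluate(arch):
    """Score a sub-architecture using the shared supernet weights:
    no retraining, so the supernet training is reused, not discarded."""
    return sum(supernet[(layer, op)] for layer, op in enumerate(arch))

random.seed(0)
train_supernet(100)
# Search: sample candidate architectures, rank them with shared weights.
best = max(
    ([random.choice(CANDIDATES) for _ in range(3)] for _ in range(10)),
    key=evaluate,
)
print(best)
```

The key design point the sketch captures is that `evaluate` only reads the shared table, which is what makes one-shot search cheap; an approach that re-initializes weights for every candidate would throw away exactly the supernet information the abstract refers to.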