Abstract
Knowledge graphs have shown increasing value in semantic search, intelligent question answering, data analysis, natural language processing, visual understanding, IoT devices, and other applications, and have become an essential foundation for the development of artificial intelligence. However, existing knowledge graphs are incomplete, which has attracted many researchers to improve their completeness using knowledge graph completion methods. Most traditional embedding-based knowledge graph completion models rely on the structural information of data-rich triples and are therefore limited by the long-tail distribution of relations in the triples. To address this problem, we propose a BERT-based knowledge graph completion algorithm for few-shot knowledge graphs, whose main goal is to perform the knowledge graph completion task with only a few sample triples as training instances. We improve GMatching, a baseline model for few-shot knowledge graphs, by introducing the BERT pre-trained language representation model to enhance the semantic representation of the entities and relations in the triples. Experiments demonstrate that our improved model, B-GMatching, achieves good results.
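The few-shot matching idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it scores a candidate (head, tail) pair for a rare relation by its similarity to the few reference pairs available for that relation. The function and variable names are illustrative, and the random vectors stand in for the entity embeddings that B-GMatching would derive from BERT.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def match_score(reference_pairs, query_pair):
    """Score a query (head, tail) entity pair against few-shot references.

    Each pair is represented as the concatenation of its head and tail
    entity embeddings; the score is the best similarity to any reference.
    In B-GMatching the embeddings would be BERT-derived (hypothetical
    placeholder vectors are used here).
    """
    q = np.concatenate(query_pair)
    return max(cosine(np.concatenate(r), q) for r in reference_pairs)

rng = np.random.default_rng(0)
dim = 8
# One-shot reference pair for a long-tail relation.
ref = [(rng.normal(size=dim), rng.normal(size=dim))]
# A query pair close to the reference, and an unrelated one.
q1 = (ref[0][0] + 0.01 * rng.normal(size=dim),
      ref[0][1] + 0.01 * rng.normal(size=dim))
q2 = (rng.normal(size=dim), rng.normal(size=dim))
assert match_score(ref, q1) > match_score(ref, q2)
```

In the matching-based setting, ranking candidate tails by such a score is what allows completion with only one or a few training triples per relation, rather than the many triples that embedding models like TransE require.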