Abstract

The first never-ending learning system reported in the literature is NELL (Never-Ending Language Learning). NELL's main goal is to learn to read the web better each day and to store the gathered knowledge in an ever-growing knowledge base (KB). Because NELL's KB grows continuously, it does not contain all instances of every category, nor all instances of every relation described in the ontology. In this paper, we investigate a methodology that can help NELL populate its own KB using Bayesian networks (BNs). To do so, we apply two BN learning algorithms, DMBC and VOMOS, to induce BNs from facts (knowledge) already stored in NELL's KB. The empirical results show that both algorithms are promising for representing semantic relations and extending NELL's ontology. In addition, the BNs induced by DMBC and VOMOS presented good inference results, suggesting an alternative way to learn new facts and help populate NELL's KB.
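The abstract does not give implementation details for DMBC or VOMOS. As a rough illustration of the overall idea (inducing a BN from KB facts and then using probabilistic inference to suggest new facts), the sketch below uses pgmpy's generic score-based structure learning as a stand-in for the paper's algorithms. The toy data, the category/relation columns (isCity, hasMayor, ...), and the choice of pgmpy are all assumptions made for illustration; exact class names may vary with the pgmpy version.

```python
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore, MaximumLikelihoodEstimator
from pgmpy.models import BayesianNetwork
from pgmpy.inference import VariableElimination

# Hypothetical binary indicators derived from KB facts: each row is an
# entity, each column records whether it belongs to a category or
# participates in a relation (toy data, not NELL's actual KB).
data = pd.DataFrame({
    "isCity":    [1, 1, 1, 1, 0, 0, 0, 0, 1, 0],
    "isCompany": [0, 0, 0, 0, 1, 1, 1, 1, 0, 1],
    "hasMayor":  [1, 1, 1, 0, 0, 0, 0, 0, 1, 0],
    "hasCEO":    [0, 0, 0, 0, 1, 1, 0, 1, 0, 1],
})

# Structure learning (hill climbing with a BIC score) as a generic
# stand-in for DMBC / VOMOS.
dag = HillClimbSearch(data).estimate(scoring_method=BicScore(data))
model = BayesianNetwork(dag.edges())
model.add_nodes_from(data.columns)  # keep variables with no learned edges

# Parameter learning: maximum-likelihood CPTs from the same facts.
model.fit(data, estimator=MaximumLikelihoodEstimator)

# Inference: probability that an entity has a mayor, given it is a city,
# i.e. a candidate new fact to add to the KB if the probability is high.
infer = VariableElimination(model)
print(infer.query(variables=["hasMayor"], evidence={"isCity": 1}))
```

In this sketch, a high posterior for a relation indicator given the known categories of an entity would flag a candidate fact for insertion into the KB; how DMBC and VOMOS actually construct the network and score candidates is described in the paper itself.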
