Abstract

Statistical machine learning models are a concise representation of probabilistic dependencies among the attributes of an object. Most such models assume that training and testing data come from the same distribution. Transfer learning has emerged as an essential technique for scenarios where this assumption does not hold, as it leverages the knowledge acquired in one or more learning tasks as a starting point for solving a new task. Statistical Relational Learning (SRL) extends statistical learning to represent and learn from data with several objects and their relations. In this way, SRL deals with data described by a rich vocabulary composed of classes, objects, their properties, and relationships. When applying transfer learning to SRL, the primary challenge is to transfer the learned structure, mapping the vocabulary of a source domain to a different target domain. To address the problem of transferring across domains, we propose TransBoostler, which uses pre-trained word embeddings to guide the mapping, as the name of a predicate usually has a semantic connotation that can be mapped to a vector space model. After transferring, TransBoostler employs theory revision operators to further adapt the mapped model to the target data. In the experimental results, TransBoostler successfully transferred trees from a source to a distinct target domain, performing as well as or better than previous work while requiring less training time.

Keywords: Transfer learning, Statistical relational learning, Word embeddings
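The core idea of embedding-guided predicate mapping can be sketched as follows. This is a minimal illustration, not the paper's implementation: the hand-made toy vectors stand in for real pre-trained word embeddings (e.g., Word2Vec or fastText), and the greedy one-to-one assignment by cosine similarity is a simplified stand-in for TransBoostler's mapping procedure.

```python
import math

# Toy stand-ins for pre-trained word embeddings; TransBoostler would load
# real pre-trained vectors for the words in each predicate name.
EMBEDDINGS = {
    "advisedby":   [0.90, 0.10, 0.00],
    "workedunder": [0.85, 0.15, 0.05],
    "publication": [0.20, 0.80, 0.30],
    "movie":       [0.10, 0.90, 0.20],
}

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def map_predicates(source_preds, target_preds):
    """Greedily map each source predicate to its most similar,
    still-unassigned target predicate."""
    mapping = {}
    available = set(target_preds)
    for s in source_preds:
        best = max(available, key=lambda t: cosine(EMBEDDINGS[s], EMBEDDINGS[t]))
        mapping[s] = best
        available.remove(best)
    return mapping

# Map predicates from an academic source domain to a movie target domain.
mapping = map_predicates(["advisedby", "publication"], ["workedunder", "movie"])
print(mapping)  # {'advisedby': 'workedunder', 'publication': 'movie'}
```

With real embeddings, semantically related predicate names (such as `advisedby` and `workedunder`) land close together in the vector space, which is what lets the mapping be guided by name semantics rather than by exhaustive search.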
