Abstract

Neural networks are capable of learning complex concepts and tasks, given abundant training data. In real-world applications where data collection can be difficult, integrating domain knowledge into the model can reduce data requirements and give human experts greater control over model decisions. This paper focuses on incorporating conditional statements over tabular data as classification rules, which have a simple structure and are easy to construct. We introduce a general rule loss constraint to guide neural network training in a model-agnostic manner, and propose confidence learning to automatically weigh the contributions of multiple candidate rules. Experimental evaluation on three real-world datasets shows that the rule loss can substantially increase model performance, particularly when training data is limited.
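To make the idea concrete, the following is a minimal sketch of one common way such a rule loss can be formulated; the function name, the specific rule, and the penalty form (a negative log-likelihood on rows satisfying the rule's condition) are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def rule_loss(probs, condition_mask, target_class):
    """Illustrative soft rule penalty (assumed form, not the paper's exact loss).

    For rows where the rule's condition holds, penalize low predicted
    probability of the rule's consequent class.
    """
    if not condition_mask.any():
        return 0.0  # rule does not fire on this batch
    # Negative log-likelihood of the rule's target class on matching rows.
    p = probs[condition_mask, target_class]
    return float(-np.log(np.clip(p, 1e-12, 1.0)).mean())

# Toy batch: 4 rows, 2 classes; hypothetical rule "if x0 > 0.5 then class 1".
x = np.array([[0.9], [0.1], [0.7], [0.2]])
probs = np.array([[0.2, 0.8],
                  [0.6, 0.4],
                  [0.5, 0.5],
                  [0.7, 0.3]])
mask = x[:, 0] > 0.5          # rows 0 and 2 satisfy the condition
loss = rule_loss(probs, mask, target_class=1)
```

In training, a term like this would be added to the standard supervised loss with a weight per rule; the confidence-learning component described above would tune those weights automatically.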
