Abstract

In this paper we propose a new training algorithm for a class of polynomial models. The algorithm is derived from a learning bound for predictors that are convex combinations of functions drawn from simpler classes. In our setting, the hypotheses are polynomials over the input features, interpreted as convex combinations of homogeneous polynomials: the coefficients are restricted to be positive and to sum to 1, which simplifies the interpretation of the model. Training minimizes a surrogate of the learning bound with an iterative two-phase algorithm: the first phase decides which higher-degree monomials to add, and the second phase recomputes the coefficients by solving a convex program. We performed several experiments on binary classification datasets from different domains. The experiments show that the algorithm compares favorably in accuracy and speed with other classification methods, including recent interpretable methods such as Neural Additive Models and CORELS. Moreover, the resulting predictor can sometimes be understood and validated by a domain expert. The code is publicly available.
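The second phase described above, recomputing coefficients over the simplex, can be illustrated with a minimal sketch. This is not the paper's algorithm or code; it merely shows the kind of convex program involved: fitting nonnegative weights that sum to 1 over a fixed set of monomial features, here via projected gradient descent with the standard Euclidean projection onto the probability simplex. All names (`project_simplex`, `fit_simplex_coefficients`, the toy features) are hypothetical.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    {w : w >= 0, sum(w) = 1} (standard sort-based algorithm)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    # largest index rho with u[rho] > (css[rho] - 1) / (rho + 1)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def fit_simplex_coefficients(H, y, lr=0.5, n_iter=2000):
    """Illustrative phase-2 step: least-squares fit of the mixture weights
    over fixed monomial features, constrained to the simplex.
    H has one column per monomial evaluated on the data; y are targets."""
    n, k = H.shape
    w = np.full(k, 1.0 / k)          # start at the uniform mixture
    for _ in range(n_iter):
        grad = H.T @ (H @ w - y) / n  # gradient of the squared loss
        w = project_simplex(w - lr * grad)
    return w

# Toy usage: targets are an exact convex combination 0.3*x + 0.7*x^2,
# so the constrained fit should recover weights close to (0.3, 0.7).
x = np.linspace(0.0, 1.0, 50)
H = np.column_stack([x, x ** 2])     # degree-1 and degree-2 monomials
y = H @ np.array([0.3, 0.7])
w = fit_simplex_coefficients(H, y)
```

Because the true weights already lie on the simplex, the constrained optimum coincides with the unconstrained least-squares solution in this toy case; in general the projection is what enforces the positive, sum-to-1 structure the abstract describes.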
