Abstract

We consider the framework of competitive prediction, in which a learner's performance is guaranteed relative to other predictive models, called experts. We propose a universal algorithm for predicting finite-dimensional probability distributions, i.e. points from a probability simplex, in the Kullback–Leibler game. In the standard framework of prediction with expert advice, the performance of the learner is measured by the cumulative loss. In this paper we consider a generalisation of this setting in which losses are discounted with time. A natural choice of predictors for probability games is the class of multinomial logistic regression functions, since they output distributions lying inside the probability simplex; we take this class as our experts. We provide a strategy that 'tracks the best expert' of this type and derive a theoretical bound on its discounted loss. We also provide a kernelized version of our algorithm, which competes with a wider set of experts from a Reproducing Kernel Hilbert Space (RKHS), and prove a theoretical guarantee for the kernelized strategy. We carry out experiments on three data sets and compare the cumulative losses of our algorithm and multinomial logistic regression.
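For concreteness, here is a minimal sketch of the quantities the abstract refers to; the notation below is our assumption and is not taken verbatim from the paper. In the Kullback–Leibler game, outcomes and predictions are points of the probability simplex and the per-round loss is the KL divergence of the prediction from the outcome, while the discounted cumulative loss down-weights earlier rounds:

```latex
% Outcomes \omega_t and predictions \gamma_t lie in the simplex
% \Delta_n = \{ p \in \mathbb{R}^n : p_i \ge 0,\ \sum_{i=1}^{n} p_i = 1 \}.
% Per-round Kullback--Leibler loss:
\[
  \lambda(\omega_t, \gamma_t) = \sum_{i=1}^{n} \omega_{t,i} \, \ln \frac{\omega_{t,i}}{\gamma_{t,i}}
\]
% Discounted cumulative loss with discount factors \alpha_s \in (0,1]
% (taking \alpha_s \equiv 1 recovers the usual undiscounted cumulative loss):
\[
  L_T = \sum_{t=1}^{T} \Bigl( \prod_{s=t+1}^{T} \alpha_s \Bigr) \lambda(\omega_t, \gamma_t)
\]
```

Multinomial logistic regression experts fit this game naturally because the softmax map sends any vector of scores to a point of \(\Delta_n\), so their predictions are always valid distributions.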
