Abstract

Locally weighted jackknife prediction (LW-JP) is a variant of conformal prediction that outputs prediction intervals for regression problems. It is built on traditional point-prediction learning algorithms, referred to as underlying algorithms. Although the empirical validity and efficiency of LW-JP have been reported in several works, a theoretical understanding of it has been lacking. This paper provides a theoretical analysis of LW-JP in the asymptotic setting, where the number of training samples approaches infinity. Under some regularity assumptions and conditions, the asymptotic validity of LW-JP is proved for nonlinear regression with heteroscedastic errors. The proof extends the asymptotic analysis of leave-one-out prediction intervals in linear regression with homoscedastic errors. Based on this analysis, two conformal regressors built on LW-JP are proposed, and experimental results show that these algorithms are not only valid interval predictors but also achieve state-of-the-art performance among conformal regressors.
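To make the construction concrete, below is a minimal sketch of a locally weighted jackknife interval, assuming the standard recipe from the conformal-prediction literature: leave-one-out (LOO) residuals are normalized by a local difficulty estimate sigma_hat(x), and the interval at a test point is mu_hat(x) +/- q * sigma_hat(x), where q is an empirical quantile of the normalized LOO residuals. The choice of underlying algorithm (scikit-learn's RandomForestRegressor) and of difficulty estimator here are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch of locally weighted jackknife prediction intervals.
# Assumptions: a generic regressor as the underlying algorithm and a
# second regressor, fit on absolute residuals, as the local difficulty
# estimate sigma_hat(x). Both choices are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def lwjp_interval(X, y, X_test, alpha=0.1):
    n = len(y)

    # Point predictor on all data, plus a difficulty model that predicts
    # the local residual scale from in-sample absolute residuals.
    mu_model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    sigma_model = RandomForestRegressor(n_estimators=100, random_state=0)
    sigma_model.fit(X, np.abs(y - mu_model.predict(X)))
    sigma_hat = lambda Z: np.maximum(sigma_model.predict(Z), 1e-8)

    # Normalized leave-one-out residuals (the jackknife nonconformity scores).
    scores = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        m = RandomForestRegressor(n_estimators=100, random_state=0)
        m.fit(X[mask], y[mask])
        scores[i] = abs(y[i] - m.predict(X[i : i + 1])[0]) / sigma_hat(X[i : i + 1])[0]

    # The (1 - alpha) empirical quantile of the scores sets the interval width,
    # rescaled at each test point by its local difficulty estimate.
    q = np.quantile(scores, min(1.0, np.ceil((1 - alpha) * (n + 1)) / n))
    mu = mu_model.predict(X_test)
    half_width = q * sigma_hat(X_test)
    return mu - half_width, mu + half_width
```

The normalization by sigma_hat is what adapts the interval width to heteroscedastic errors: in regions where the difficulty model predicts large residuals, the interval widens, and it narrows elsewhere, while the quantile step keeps the overall coverage near 1 - alpha.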
