Abstract

Conformal inference is a popular tool for constructing prediction intervals. We consider the scenario of post-selection, or selective, conformal inference, in which prediction intervals are reported only for individuals selected from unlabelled test data. To account for multiplicity, we develop a general split conformal framework to construct selective prediction intervals with false coverage-statement rate control. We first investigate the false coverage rate–adjusted method of Benjamini & Yekutieli (2005) in the present setting, and show that it achieves false coverage-statement rate control but yields uniformly inflated prediction intervals. We then propose a novel solution to the problem, called selective conditional conformal prediction. Our method applies the selection procedure to both the calibration set and the test set, and then constructs conformal prediction intervals for the selected test candidates with the aid of the conditional empirical distribution obtained from the post-selection calibration set. When the selection rule is exchangeable, we show that our proposed method exactly controls the false coverage-statement rate with a model-free and distribution-free guarantee. For nonexchangeable selection procedures involving the calibration set, we provide non-asymptotic bounds on the false coverage-statement rate under mild distributional assumptions. Numerical results confirm the effectiveness and robustness of our method under false coverage-statement rate control and show that it yields narrower prediction intervals than existing methods across various settings.
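To make the construction concrete, the following is a minimal sketch of the selective split conformal idea described above, assuming a regression setting with absolute-residual nonconformity scores and a user-supplied selection rule applied to both the calibration and test sets; the score, the selection rule, and all function names are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np


def selective_conformal_intervals(model, X_calib, y_calib, X_test, select, alpha=0.1):
    """Sketch of selective conditional conformal prediction (hypothetical interface).

    `model` is a fitted regressor with a .predict() method, and `select` is a
    selection rule mapping (features, predictions) to a boolean mask.
    """
    calib_pred = model.predict(X_calib)
    test_pred = model.predict(X_test)

    # Apply the same selection rule to calibration and test points.
    calib_mask = select(X_calib, calib_pred)
    test_mask = select(X_test, test_pred)

    # Conditional empirical distribution of nonconformity scores,
    # restricted to the post-selection calibration set.
    scores = np.abs(y_calib[calib_mask] - calib_pred[calib_mask])
    n_sel = scores.size

    # Finite-sample-adjusted quantile level, as in standard split conformal.
    level = min(1.0, np.ceil((n_sel + 1) * (1 - alpha)) / n_sel)
    q = np.quantile(scores, level, method="higher")

    # Report prediction intervals only for the selected test candidates.
    sel_idx = np.where(test_mask)[0]
    sel_pred = test_pred[test_mask]
    return sel_idx, np.column_stack([sel_pred - q, sel_pred + q])
```

The key design point, as described in the abstract, is that the calibration quantile is computed only from calibration points passing the same selection rule as the test points, rather than from the full calibration set.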
