Abstract
We focus on the supervised binary classification problem, which consists in predicting the label Y associated with a covariate X∈ℝd, given a set of n independent and identically distributed covariates and associated labels (Xi,Yi). We assume that the law of the random vector (X,Y) is unknown and that the marginal law of X admits a density supported on a set ${\mathcal{A}}$. In the particular case of plug-in classifiers, solving the classification problem boils down to the estimation of the regression function $\eta(X)=\mathbb {E}[Y|X]$. Assuming first ${\mathcal{A}}$ to be known, we show how to construct an estimator of η by localized projections onto a multi-resolution analysis (MRA). In a second step, we show how this estimation procedure generalizes to the case where ${\mathcal{A}}$ is unknown. Interestingly, this novel estimation procedure achieves theoretical performance similar to that of the celebrated local-polynomial estimator (LPE). In addition, it benefits from the lattice structure of the underlying MRA and thus outperforms the LPE from a computational standpoint, which turns out to be a crucial feature in many practical applications. Finally, we prove that the associated plug-in classifier can reach super-fast rates under a margin assumption.
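To make the plug-in idea concrete, here is a minimal sketch (not the paper's actual procedure) of a plug-in classifier built on the simplest MRA, the Haar scaling functions on [0,1]^d: the regression function η is estimated by averaging the labels within each dyadic cell at resolution level j, and the classifier then thresholds the estimate at 1/2. The function names and the default value 1/2 for empty cells are our own illustrative choices.

```python
import numpy as np

def fit_haar_plugin(X, Y, j):
    """Estimate eta(x) = E[Y|X=x] by projection onto the Haar MRA at
    resolution level j on [0,1]^d: average the labels over each dyadic
    cell of side length 2**-j (empty cells default to 1/2).
    Returns the per-cell estimates plus the grid parameters."""
    X = np.atleast_2d(X)
    n, d = X.shape
    m = 2 ** j
    # Index of the dyadic cell containing each sample, flattened to 1-D.
    idx = np.clip((X * m).astype(int), 0, m - 1)
    flat = np.ravel_multi_index(idx.T, (m,) * d)
    sums = np.bincount(flat, weights=Y, minlength=m ** d)
    counts = np.bincount(flat, minlength=m ** d)
    eta_hat = np.where(counts > 0, sums / np.maximum(counts, 1), 0.5)
    return eta_hat, m, d

def predict(eta_hat, m, d, Xnew):
    """Plug-in rule: classify as 1 wherever the estimated eta >= 1/2."""
    Xnew = np.atleast_2d(Xnew)
    idx = np.clip((Xnew * m).astype(int), 0, m - 1)
    flat = np.ravel_multi_index(idx.T, (m,) * d)
    return (eta_hat[flat] >= 0.5).astype(int)
```

The dyadic lattice is what makes this fast: locating the cell of a query point is a single rounding operation, whereas a local-polynomial estimator must solve a weighted least-squares problem around each query point.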