Abstract
One main focus of learning theory is to find optimal rates of convergence. In classification, it is possible to obtain optimal fast rates (faster than $n^{-1/2}$) in a minimax sense. Moreover, an aggregation procedure yields algorithms that are adaptive to the parameters of the class of distributions. Here, we investigate this issue in the bipartite ranking framework. We design a ranking rule by aggregating estimators of the regression function, using exponential weights based on the empirical ranking risk. Under several assumptions on the class of distributions, we show that this procedure is adaptive to the margin and smoothness parameters and achieves the same rates as in the classification framework. Moreover, we state a minimax lower bound that establishes the optimality of the aggregation procedure in a specific case.
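For concreteness, the minimal Python sketch below illustrates this kind of aggregation: exponential weights are computed from the empirical ranking risk (the proportion of misranked positive/negative pairs) and used to average a list of candidate scoring functions. The function names, the temperature parameter and the exact normalization are illustrative assumptions, not the paper's precise construction.

```python
import numpy as np

def empirical_ranking_risk(scores, y):
    """Fraction of misranked positive/negative pairs.

    A pair is misranked when a positive example (y = 1) does not receive a
    strictly higher score than a negative example (y = 0); ties are counted
    as errors here for simplicity.
    """
    pos, neg = scores[y == 1], scores[y == 0]
    return float((pos[:, None] <= neg[None, :]).mean())

def aggregate_by_exponential_weights(candidates, X, y, temperature=1.0):
    """Combine candidate scoring functions with exponential weights driven
    by their empirical ranking risk (schematic version)."""
    n = len(y)
    risks = np.array([empirical_ranking_risk(f(X), y) for f in candidates])
    logits = -n * risks / temperature
    logits -= logits.max()              # for numerical stability
    weights = np.exp(logits)
    weights /= weights.sum()
    # The aggregated ranking rule is the weighted average of candidate scores.
    return lambda x: sum(w * f(x) for w, f in zip(weights, candidates))
```

A candidate here is any callable mapping inputs to scores, for instance plug-in estimates of the regression function fitted with different tuning parameters on a separate split of the data.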
Highlights
The design of estimators that achieve optimal rates of convergence is a major topic in statistical learning
We introduce two margin assumptions in the context of bipartite ranking and relate them to the assumption made in previous work (an illustrative form is sketched after these highlights)
We show that the resulting decision rule satisfies an oracle inequality that can be used to derive minimax upper bounds
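For orientation, the classification margin (low-noise) condition of Tsybakov and a natural pairwise analogue for bipartite ranking are recalled below, with margin parameter $\alpha$ and regression function $\eta$. This is an illustrative form; the paper's exact two assumptions may be stated differently.

```latex
% Tsybakov margin condition in classification:
\mathbb{P}\bigl( 0 < |\eta(X) - \tfrac{1}{2}| \le t \bigr) \le C\, t^{\alpha},
\qquad \forall\, t > 0,
% and a pairwise analogue for bipartite ranking,
% with X' an independent copy of X:
\mathbb{P}\bigl( 0 < |\eta(X) - \eta(X')| \le t \bigr) \le C\, t^{\alpha},
\qquad \forall\, t > 0.
```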
Summary
The design of estimators that achieve optimal rates of convergence is a major topic in statistical learning. It has been investigated in many settings, such as regression, density estimation and classification. In [9], minimax rates faster than $n^{-1/2}$ are achieved over classes of distributions controlled by a smoothness parameter and a margin parameter. The same estimator of the regression function as in classification is used here, but this estimator requires knowledge of the regularity parameter. When aggregating the plug-in estimators of [9], the procedure becomes adaptive to the parameters of the class of distributions under the strong density assumption.
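Schematically, this adaptivity comes from an oracle inequality of the type below: the aggregate pays only a residual term of order $\log M / n$ on top of the risk of the best of the $M$ candidates, and therefore inherits the rate of the plug-in estimator tuned to the unknown smoothness. The displayed form is a standard template for exponential-weights aggregation, not the paper's exact statement.

```latex
% Schematic oracle inequality for the aggregate \tilde{s}_n built from
% candidate estimators \hat{s}_1, \dots, \hat{s}_M:
\mathbb{E}\, R(\tilde{s}_n) - R^{*}
  \;\le\; \min_{1 \le k \le M} \bigl( \mathbb{E}\, R(\hat{s}_k) - R^{*} \bigr)
  \;+\; C\, \frac{\log M}{n}.
```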