Abstract
We study minimax lower bounds for function estimation problems on large graphs when the target function is smoothly varying over the graph. We derive minimax rates for regression and classification problems on graphs that satisfy an asymptotic shape assumption, under a smoothness condition on the target function; both conditions are formulated in terms of the graph Laplacian.
Highlights
In recent years there has been substantial interest in high-dimensional estimation and prediction problems on large graphs. These can in many cases be seen as high-dimensional or nonparametric regression or classification problems in which the goal is to learn a "smooth" function on a given graph.
We introduce a Sobolev-type smoothness condition on the target function, with smoothness again quantified in terms of the graph Laplacian.
We derive our results under an asymptotic geometry assumption on the graph, first introduced in Kirichenko and van Zanten (2017) and formulated in terms of the Laplacian eigenvalues.
Summary
In recent years there has been substantial interest in high-dimensional estimation and prediction problems on large graphs. Convergence rates have been obtained by Sadhanala et al. (2016) in the context of regression on a regular grid using total variation penalties, and by Kirichenko and van Zanten (2017) for nonparametric Bayes procedures for regression and classification on more general graphs. We have chosen our setup and normalisations in such a way that the optimal rates over balls of smooth functions that we obtain are of the usual form n^{-β/(r+2β)}. This shows that the geometry parameter r can be interpreted as a kind of "dimension" of the graph. The lower bounds match the upper bounds we obtained in Kirichenko and van Zanten (2017). This shows that the nonparametric Bayes procedures we proposed in the latter paper are smoothness-adaptive and rate-optimal.
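As a minimal numerical sketch (not from the paper), the quantities above can be illustrated on the path graph: its Laplacian eigenvalues grow like λ_i ≍ (i/n)^{2/r} with geometry parameter r = 1, matching the intuition that r acts as a dimension, and the rate n^{-β/(r+2β)} can then be evaluated directly. The choices n = 200 and β = 2 below are arbitrary, for illustration only.

```python
import numpy as np

def path_laplacian(n):
    """Graph Laplacian L = D - A of the path graph on n vertices."""
    L = np.zeros((n, n))
    for i in range(n - 1):
        L[i, i] += 1.0
        L[i + 1, i + 1] += 1.0
        L[i, i + 1] -= 1.0
        L[i + 1, i] -= 1.0
    return L

n = 200
lam = np.linalg.eigvalsh(path_laplacian(n))  # ascending; lam[0] ~ 0 (connected)

# Closed form for the path graph: lam_i = 4 sin^2(pi i / (2n)) ~ (pi i / n)^2
# for small i, i.e. lam_i ≍ (i/n)^{2/r} with geometry parameter r = 1.
idx = np.arange(1, 20)
ratio = lam[idx] / (np.pi * idx / n) ** 2  # should be close to 1

# Minimax rate over a ball of beta-smooth functions on a graph with
# geometry parameter r (beta chosen here purely for illustration).
beta, r = 2.0, 1.0
rate = n ** (-beta / (r + 2 * beta))
```

The ratio check confirms the quadratic eigenvalue growth (r = 1), so for the path graph the rate reduces to the familiar one-dimensional form n^{-β/(1+2β)}.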