Abstract

Neural networks are routinely used for nonparametric regression modeling. Interest in these models is growing with the ever‐increasing complexity of modern datasets. With modern technological advancements, the number of predictors frequently exceeds the sample size in many application areas. Selecting the important predictors from this large pool is therefore an extremely important task for judicious inference. This paper proposes a novel, flexible class of single‐layer radial basis function (RBF) networks. The proposed architecture can estimate smooth unknown regression functions and also perform variable selection. We primarily focus on the Gaussian RBF‐net due to its attractive properties; the extensions to other choices of RBF are fairly straightforward. The proposed architecture is also shown to be effective in identifying relevant predictors in a low‐dimensional setting using the posterior samples, without imposing any sparse estimation scheme. We develop an efficient Markov chain Monte Carlo algorithm to generate posterior samples of the parameters. We illustrate the proposed method's empirical efficacy through simulation experiments in both high‐ and low‐dimensional regression problems. The posterior contraction rate is established with respect to an empirical distance, assuming that the error variance is unknown and the true function belongs to a Hölder ball. We illustrate our method on a Human Connectome Project dataset to predict vocabulary comprehension and to identify important edges of the structural connectome.
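To make the single‐layer Gaussian RBF‐net concrete, the sketch below fits the regression function f(x) = Σⱼ wⱼ exp(−‖x − μⱼ‖² / (2h²)) on toy data. This is only a minimal frequentist illustration of the architecture, not the paper's Bayesian method: the centers are drawn from the data, the bandwidth is fixed, and the output weights are obtained by ridge‐regularized least squares rather than MCMC; all function names and parameter choices here are our own assumptions.

```python
import numpy as np

def rbf_features(X, centers, bandwidth):
    """Gaussian RBF basis: phi_j(x) = exp(-||x - mu_j||^2 / (2 * bandwidth^2))."""
    # Squared distances between each row of X and each center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def fit_rbf_net(X, y, n_centers=10, bandwidth=1.0, ridge=1e-6, seed=0):
    """Least-squares fit of the output weights of a single-layer Gaussian RBF net.
    Centers are subsampled from the data (an illustrative choice, not the paper's)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_centers, replace=False)]
    Phi = rbf_features(X, centers, bandwidth)
    # Ridge-regularized normal equations for numerical stability
    w = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_centers), Phi.T @ y)
    return centers, w

# Toy 1-D regression: y = sin(x) + noise
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
centers, w = fit_rbf_net(X, y, n_centers=15, bandwidth=0.8)
pred = rbf_features(X, centers, 0.8) @ w
print("training MSE:", float(np.mean((pred - y) ** 2)))
```

In the paper's Bayesian treatment, the centers, bandwidths, and weights would instead be assigned priors and sampled by MCMC, with variable selection read off from the posterior samples.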
