Abstract
We investigate hypothesis testing in nonparametric additive models estimated by simplified smooth backfitting (Huang and Yu, Journal of Computational and Graphical Statistics, 28(2), 386–400, 2019). Simplified smooth backfitting achieves oracle properties under regularity conditions and provides closed-form expressions for the estimators that are useful for deriving asymptotic properties. We develop a testing framework for inference based on the generalized likelihood ratio (GLR) (Fan, Zhang and Zhang, Annals of Statistics, 29(1), 153–193, 2001) and on a loss function (LF) (Hong and Lee, Annals of Statistics, 41(3), 1166–1203, 2013). Under the null hypothesis, both the GLR and LF test statistics are asymptotically rescaled chi-squared, and both exhibit the Wilks phenomenon: the scaling constants and degrees of freedom are independent of nuisance parameters. The tests are asymptotically optimal in terms of rates of convergence for nonparametric hypothesis testing. Moreover, bandwidths that are well suited for model estimation may also be useful for testing. We show that in additive models the LF test is asymptotically more powerful than the GLR test. Simulations demonstrate the Wilks phenomenon and the power of the proposed GLR and LF tests, and a real-data example illustrates their usefulness.
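For orientation, the following is a schematic of a GLR statistic and its Wilks-type null limit in the general form of Fan, Zhang and Zhang (2001); the notation (lambda_n, RSS_0, RSS_1, r_K, d_n) is generic, and the specific scaling constant and degrees of freedom for simplified smooth backfitting are derived in the paper rather than reproduced here.

% Schematic GLR statistic and rescaled chi-squared null limit (generic notation);
% RSS_0 and RSS_1 denote residual sums of squares under the null and the full
% additive fit, and r_K, d_n depend on the kernel and bandwidth but not on
% nuisance parameters (the Wilks phenomenon).
\[
  \lambda_n = \frac{n}{2}\log\frac{\mathrm{RSS}_0}{\mathrm{RSS}_1},
  \qquad
  r_K\,\lambda_n \;\overset{a}{\sim}\; \chi^2_{d_n},
  \qquad d_n \to \infty .
\]

The LF test replaces the likelihood-ratio contrast with a loss-based comparison of the restricted and unrestricted fits in the sense of Hong and Lee (2013), and admits an analogous rescaled chi-squared approximation under the null.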