Abstract

Compared with local polynomial quantile regression, K-nearest-neighbor quantile regression (KNNQR) has several advantages, such as not assuming smoothness of the underlying function. This paper surveys the research on KNNQR and carries out further study of the selection of k, the algorithm, and Monte Carlo simulations. The simulated functions are Blocks, Bumps, HeaviSine, and Doppler, which represent jumps, volatility, abrupt changes in slope, and high-frequency oscillation, respectively. When the function to be estimated has jump points or abrupt-change points, KNNQR outperforms local linear quantile regression under the mean squared error and mean absolute error criteria. Notably, the superiority of KNNQR can be observed even in the high-frequency case. A real data set is analyzed as an illustration.
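The core idea of KNNQR can be sketched as follows: at each evaluation point, collect the k training observations whose covariates are closest, then take the sample tau-quantile of their responses. This is a minimal illustrative sketch, not the paper's exact algorithm; the function name, the fixed choice of k, and the Euclidean distance are assumptions for illustration (the paper studies how k should be selected).

```python
import numpy as np

def knn_quantile_regression(x_train, y_train, x_eval, k=20, tau=0.5):
    """Hypothetical sketch of KNN quantile regression: for each point in
    x_eval, find the k nearest training covariates and return the sample
    tau-quantile of their responses."""
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    estimates = []
    for x0 in np.atleast_1d(np.asarray(x_eval, dtype=float)):
        # Indices of the k training points closest to x0 (1-D covariate).
        idx = np.argsort(np.abs(x_train - x0))[:k]
        # Sample quantile of the neighbors' responses estimates the
        # conditional tau-quantile at x0.
        estimates.append(np.quantile(y_train[idx], tau))
    return np.array(estimates)
```

Because the estimate at each point depends only on a local neighborhood, the method adapts to jumps and abrupt slope changes without any smoothness assumption, which is consistent with the simulation findings for the Blocks and HeaviSine functions.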
