Abstract

“Locally linear reconstruction” (LLR) provides a principled and k-insensitive way to determine the weights in k-nearest neighbor (k-NN) learning. LLR, however, does not provide a confidence interval for the k-neighbor-based reconstruction of a query point, which is required in many real application domains. Moreover, its fixed linear structure makes the local reconstruction model unstable, causing regression performance to fluctuate under different values of k. We therefore propose probabilistic local reconstruction (PLR), an extension of LLR for k-NN regression. First, we capture the reconstruction uncertainty probabilistically by incorporating a Gaussian regularization prior into the reconstruction model. This prevents over-fitting when no informative neighbors are available for the local reconstruction. We then project the data into a higher-dimensional feature space to capture the non-linear relationship between the neighbors and the query point when k is large. Preliminary experimental results demonstrate that the proposed Bayesian kernel treatment improves accuracy and k-invariance. Moreover, an experiment on a real virtual metrology data set from semiconductor manufacturing shows that the uncertainty information PLR attaches to its predictions supports more appropriate decision making.
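The abstract gives no formulas, but the plain (non-probabilistic) LLR baseline it extends is well known: reconstruct the query point as an affine combination of its k nearest neighbors by solving a small regularized linear system on the local Gram matrix, then apply the same weights to the neighbors' targets. The sketch below illustrates that baseline only, not the proposed PLR model; the function name, the `reg` parameter, and the trace-scaled regularizer are illustrative assumptions, not details from the paper.

```python
import numpy as np

def llr_knn_predict(X_train, y_train, x_query, k=10, reg=1e-3):
    """k-NN regression with locally-linear-reconstruction weights (sketch)."""
    # Find the k nearest neighbors of the query point.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    idx = np.argsort(dists)[:k]
    N = X_train[idx]                      # (k, d) matrix of neighbors

    # Local Gram matrix of the neighbors, centered on the query.
    Z = N - x_query                       # shift so the query is the origin
    G = Z @ Z.T                           # (k, k)

    # Ridge-style regularization (assumed form): stabilizes the solve
    # when k exceeds the intrinsic dimension and G is singular.
    G = G + reg * np.trace(G) / k * np.eye(k)

    # Reconstruction weights: solve G w = 1, normalize so sum(w) = 1,
    # i.e. the query is approximated as an affine combination of neighbors.
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()

    # Prediction = the same weights applied to the neighbors' targets.
    return w @ y_train[idx]
```

On data where the target is (locally) linear in the inputs, the affine reconstruction makes the prediction nearly exact; the instability the abstract mentions appears when `reg` is too small and `G` is ill-conditioned for large k.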
