Abstract

This paper focuses on an important direction in modern data analysis: functional data regression. Functional data are typically infinite-dimensional, and this paper examines the case where the predictor is functional and the response is a continuous scalar. In this setting, the paper proposes a regression method that combines kernel-localized and regularized functional sliced inverse regression with Bayesian model averaging. The method is novel in two respects: first, compared to traditional sliced inverse regression, it captures more detailed local information through kernel clustering, yielding better estimates of the reduced subspace when the number of slices is small; second, model averaging balances the bias and variance of the prediction model, avoiding both overfitting and underfitting. Empirical results show that the proposed method achieves smaller mean squared and absolute errors than several classical methods and exhibits a degree of robustness. Finally, the paper explores potential applications of the kernel clustering idea in other covariance-based dimension reduction approaches.
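For readers unfamiliar with the baseline the abstract refers to, the following is a minimal NumPy sketch of classical (multivariate) sliced inverse regression, the traditional method the proposed approach is compared against. The function name, parameters, and the small ridge term are illustrative assumptions, not taken from the paper; the paper's method additionally handles functional predictors and adds kernel localization, regularization, and Bayesian model averaging, none of which are shown here.

    import numpy as np

    def sliced_inverse_regression(X, y, n_slices=10, n_directions=2):
        """Plain-vanilla SIR sketch (not the paper's kernel-localized variant).

        X : (n, p) array of predictors, y : (n,) continuous response.
        Returns an (n_directions, p) array whose rows estimate the
        effective dimension-reduction directions.
        """
        n, p = X.shape
        # Standardize predictors: Z = (X - mean) Sigma^{-1/2}
        Xc = X - X.mean(axis=0)
        Sigma = np.cov(Xc, rowvar=False) + 1e-8 * np.eye(p)  # small ridge for numerical stability (assumption)
        evals, evecs = np.linalg.eigh(Sigma)
        Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
        Z = Xc @ Sigma_inv_sqrt

        # Partition observations into slices by sorted response values
        order = np.argsort(y)
        slices = np.array_split(order, n_slices)

        # Weighted covariance of the slice means of Z
        M = np.zeros((p, p))
        for idx in slices:
            m_h = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m_h, m_h)

        # Leading eigenvectors of M, mapped back to the original predictor scale
        w, v = np.linalg.eigh(M)
        top = v[:, np.argsort(w)[::-1][:n_directions]]
        return (Sigma_inv_sqrt @ top).T

    # Illustrative usage on synthetic data
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 6))
    y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
    directions = sliced_inverse_regression(X, y, n_slices=10, n_directions=2)

In this classical form, each slice contributes a single mean vector; the kernel clustering described in the abstract is aimed at recovering finer local structure within slices, which matters most when only a few slices are available.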
