<p>Local polynomial fitting has many appealing statistical properties, particularly in the multivariate setting. As functional data analysis grows into an active and relevant field of data science, however, a theory of local polynomial fitting tailored to this setting is needed. We study the estimation of the regression function operator and its partial derivatives for stationary mixing random processes, denoted $ (Y_i, X_i) $, using local higher-order polynomial fitting. Our main contributions are to establish the joint asymptotic normality of the estimators of both the regression function and its partial derivatives for strongly mixing processes, and to provide explicit expressions for the bias and the variance-covariance matrix of the asymptotic distribution. We also prove uniform strong consistency over compact subsets, with rates of convergence, for both the regression function and its partial derivatives. These results hold under reasonably broad conditions on the underlying models. To illustrate practical applicability, we use our results to construct pointwise confidence regions. Finally, we extend our approach to the nonparametric conditional distribution and obtain its limiting distribution.</p>
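<p>To fix ideas, the following is a minimal illustrative sketch of local polynomial fitting in the simplest degree-1, scalar-covariate case, where the intercept and slope of a locally weighted least-squares fit estimate the regression function and its first derivative. The paper's setting (multivariate covariates, higher-order polynomials, strongly mixing processes) is more general; the Gaussian kernel, bandwidth <code>h</code>, and toy data below are placeholder choices for illustration only, not the authors' estimator or tuning.</p>

<pre><code>import numpy as np

def local_linear_fit(x0, X, Y, h):
    """Local linear (degree-1 local polynomial) estimates at x0.

    Returns (m_hat, dm_hat): the locally weighted least-squares intercept
    and slope, which estimate the regression function m(x0) and its first
    derivative m'(x0).
    """
    u = X - x0
    # Gaussian kernel weights centered at x0 with bandwidth h (placeholder choice)
    w = np.exp(-0.5 * (u / h) ** 2)
    # Design matrix for a first-order Taylor expansion around x0
    Z = np.column_stack([np.ones_like(u), u])
    W = np.diag(w)
    # Weighted least squares: beta = (Z' W Z)^{-1} Z' W Y
    beta = np.linalg.solve(Z.T @ W @ Z, Z.T @ W @ Y)
    return beta[0], beta[1]

# Toy usage: noisy observations of a smooth regression function
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 500)
Y = np.sin(2 * np.pi * X) + 0.2 * rng.standard_normal(500)
m_hat, dm_hat = local_linear_fit(0.5, X, Y, h=0.05)
print(m_hat, dm_hat)  # compare with sin(pi) = 0 and 2*pi*cos(pi) = -6.28
</code></pre>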