Abstract

This paper proposes a novel prediction approach for a semi-functional linear model comprising a functional component and a nonparametric component. The study establishes the minimax optimal rates of convergence for this model, revealing that the functional component can be learned at the same minimax rate as if the nonparametric component were known, and vice versa. This result is achieved by using a double-penalized least squares method to estimate both components within the framework of reproducing kernel Hilbert spaces. Thanks to the representer theorem, the approach also offers other desirable features, including computational efficiency: the estimators admit a closed form and require no iterative optimization. We also provide numerical studies that demonstrate the effectiveness of the method and validate the theoretical analysis.
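As a rough illustration of the kind of estimator the abstract describes, the sketch below fits a semi-functional linear model Y = ∫X(t)β(t)dt + g(Z) + ε by double-penalized least squares with Gaussian kernels. All model choices here (kernels, bandwidths, penalty levels, simulated data) are illustrative assumptions, not the paper's actual specification; the point is only that, via the representer theorem, both coefficient vectors come from solving one linear system, with no iterations.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 60, 50                     # samples, grid points on [0, 1]
t = np.linspace(0, 1, m)
h = t[1] - t[0]                   # Riemann quadrature weight

# --- simulated data (hypothetical example, not from the paper) ---
X = rng.normal(size=(n, m)).cumsum(axis=1) * np.sqrt(h)  # rough Brownian paths
Z = rng.uniform(size=n)
beta_true = np.sin(2 * np.pi * t)                        # true slope function
Y = X @ beta_true * h + np.cos(2 * np.pi * Z) + 0.1 * rng.normal(size=n)

# --- RKHS ingredients: Gaussian kernels for both components (assumed) ---
def gauss(u, v, s):
    return np.exp(-(u[:, None] - v[None, :]) ** 2 / (2 * s ** 2))

K1 = gauss(t, t, 0.2)             # kernel on [0, 1] for beta
Sigma = (X * h) @ K1 @ (X * h).T  # Sigma_ij = ∫∫ X_i(t) K1(t,s) X_j(s) dt ds
G = gauss(Z, Z, 0.2)              # Gram matrix of the scalar covariate

# --- double-penalized least squares, closed form via representer theorem ---
# minimize ||Y - Sigma a - G b||^2 + lam1 a'Sigma a + lam2 b'G b;
# the first-order conditions give one block-linear system in (a, b).
lam1, lam2 = 1e-3, 1e-3
I = np.eye(n)
A = np.block([[Sigma + lam1 * I, G],
              [Sigma,            G + lam2 * I]])
coef = np.linalg.solve(A, np.concatenate([Y, Y]))
a, b = coef[:n], coef[n:]

beta_hat = K1 @ (X * h).T @ a     # beta(t) = sum_j a_j ∫ K1(t,s) X_j(s) ds
fit = Sigma @ a + G @ b           # in-sample predictions
print("training RMSE:", np.sqrt(np.mean((Y - fit) ** 2)))
```

The block system is invertible whenever the penalties are positive, so the whole fit is a single `solve` call, which mirrors the abstract's remark that the algorithm requires no iterations.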

