Abstract

Various linear and nonlinear function-on-function (FOF) regression models have been proposed to study the relationship between functional variables, each assuming a specific form for that relationship. However, because functional variables take values in infinite-dimensional spaces, the relationships between them can be far more complicated than those between scalar variables. The forms assumed in existing FOF models may not be flexible enough to cover a wide variety of relationships between functional variables, which can limit the applicability of these models. We consider a general nonlinear FOF regression model without any specific assumption on the model form. To fit the model, inspired by the universal approximation theorem for neural networks with arbitrary width, we develop a functional universal approximation theorem, which asserts that a wide range of general maps between functional variables can be approximated with arbitrary accuracy by members of our proposed family of maps. This family is "fully" functional in that the complexity of a map in the family is completely determined by the smoothness of its component functions. With this functional universal approximation theorem, we develop a novel method to fit the general nonlinear FOF regression model, which includes all existing FOF models as special cases. The complexity of the fitted model is controlled by smoothness regularization, without the need to choose the number of hidden neurons. Supplementary materials containing code and additional information are available online.
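To make the general idea concrete, below is a minimal illustrative sketch, not the authors' estimation procedure: it approximates a nonlinear FOF map of the hypothetical form Y(t) ≈ Σ_k c_k(t)·tanh(a_k + ∫ b_k(s)X(s)ds), parameterizes the component functions b_k, c_k in a Fourier-type basis, and fits them by penalized least squares, with a roughness penalty standing in for the smoothness regularization mentioned in the abstract. The bases, the simulated curves, and all names below are assumptions made for illustration only.

```python
# Hedged sketch (NOT the paper's method): a single layer of continuous maps
#   Y_i(t) ~ sum_k c_k(t) * tanh( a_k + \int b_k(s) X_i(s) ds ),
# with b_k, c_k expanded in a Fourier-type basis and fitted by gradient
# descent on a roughness-penalized squared error. All data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n, S, T, K, J = 200, 50, 60, 5, 7            # samples, grid sizes, hidden maps, basis size
s = np.linspace(0, 1, S)
t = np.linspace(0, 1, T)

def basis(grid, J):
    """Simple Fourier-type basis evaluated on a grid: shape (len(grid), J)."""
    cols = [np.ones_like(grid)]
    for j in range(1, J):
        cols.append(np.sin(np.pi * j * grid) if j % 2 else np.cos(np.pi * j * grid))
    return np.stack(cols, axis=1)

Bs, Bt = basis(s, J), basis(t, J)

# Simulated functional data: smooth random curves X_i(s), nonlinear responses Y_i(t).
X = rng.normal(size=(n, J)) @ Bs.T                      # (n, S)
Y = np.tanh(X @ np.sin(np.outer(s, t)) / S) + 0.05 * rng.normal(size=(n, T))

# Parameters: basis coefficients of b_k(s), c_k(t), and intercepts a_k.
Wb = 0.1 * rng.normal(size=(K, J))
Wc = 0.1 * rng.normal(size=(K, J))
a = np.zeros(K)

# Roughness-penalty stand-in: higher-frequency basis coefficients cost more
# (for a Fourier basis, a second-derivative penalty scales roughly like j^4).
P = np.diag(np.arange(J, dtype=float) ** 4)
lam, lr = 1e-3, 0.05

for _ in range(500):                                    # plain gradient descent
    b = Wb @ Bs.T                                       # (K, S): b_k(s) on the grid
    c = Wc @ Bt.T                                       # (K, T): c_k(t) on the grid
    Z = X @ b.T / S + a                                 # (n, K): \int b_k(s) X_i(s) ds + a_k
    H = np.tanh(Z)                                      # (n, K): nonlinearity
    Yhat = H @ c                                        # (n, T): fitted curves
    R = Yhat - Y

    # Gradients of 0.5*||R||^2/n + lam*(roughness penalties on Wb, Wc).
    Gc = (H.T @ R / n) @ Bt + 2 * lam * Wc @ P
    dH = (R @ c.T / n) * (1 - H ** 2)
    Gb = (dH.T @ X / S) @ Bs + 2 * lam * Wb @ P
    Ga = dH.sum(axis=0)

    Wb -= lr * Gb
    Wc -= lr * Gc
    a -= lr * Ga

print("training MSE:", np.mean(R ** 2))
```

In this toy setting, increasing `lam` forces the estimated component functions b_k and c_k to be smoother and hence the fitted map to be simpler, mirroring how the abstract describes controlling model complexity through smoothness regularization rather than through the number of hidden neurons.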
