We consider the setting of multiscale overdamped Langevin stochastic differential equations, and study the problem of learning the drift function of the homogenized dynamics from continuous-time observations of the multiscale system. We expand the drift term into a truncated series of basis functions, and employ stochastic gradient descent in continuous time to infer the coefficients of the expansion. Due to the incompatibility between the multiscale data and the homogenized model, the estimator alone cannot reconstruct the exact drift. We therefore propose to filter the original trajectory through appropriate kernels and to include the filtered data in the stochastic differential equation for the estimator, which resolves the misspecification issue. Several numerical experiments highlight the accuracy of our approach. Moreover, in a simplified framework we prove that our estimator is asymptotically unbiased in the limit of infinite data and as the multiscale parameter describing the fastest scale vanishes.
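As a rough illustration of the kind of scheme summarized above (not the paper's exact formulation), the following minimal Python sketch simulates a one-dimensional multiscale overdamped Langevin equation, passes the trajectory through an exponential filter, and runs a continuous-time stochastic gradient descent update for the coefficients of a drift expansion evaluated on the filtered data. The quadratic and cosine potentials, the exponential kernel, the basis functions, the learning-rate schedule, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch: SGD in continuous time (Euler-discretized) for the drift
# coefficients of the homogenized dynamics, driven by a multiscale trajectory
# whose filtered version enters the estimator update. All modeling choices
# below are illustrative assumptions.

rng = np.random.default_rng(0)

eps = 0.05      # multiscale parameter (fast scale)
sigma = 0.5     # diffusion coefficient
dt = 1e-4       # time step (must resolve the fast scale ~ eps^2)
T = 50.0        # final time
n_steps = int(T / dt)

def drift_multiscale(x):
    # Slow part: confining potential V(x) = x^2 / 2 (drift -x).
    # Fast part: periodic perturbation p(y) = cos(y) with y = x / eps,
    # contributing -(1/eps) * p'(x/eps) = (1/eps) * sin(x/eps).
    return -x + (1.0 / eps) * np.sin(x / eps)

def basis(x):
    # Truncated expansion of the homogenized drift: b(x; theta) = theta_0 x + theta_1 x^3.
    return np.array([x, x**3])

theta = np.zeros(2)  # coefficients to be learned
x = 1.0              # multiscale trajectory
z = x                # filtered trajectory (exponential kernel of width delta)
delta = 0.1          # filter width
lr0 = 1.0            # initial learning rate, decayed as 1 / (1 + t)

for k in range(n_steps):
    t = k * dt
    dW = np.sqrt(dt) * rng.standard_normal()
    dx = drift_multiscale(x) * dt + np.sqrt(2.0 * sigma) * dW

    # Exponential filter: dZ = -(Z - X) / delta dt.
    z += -(z - x) / delta * dt

    # SGD-in-continuous-time update: the basis is evaluated at the filtered
    # point Z, while the increment dX of the original data acts as observation.
    phi = basis(z)
    lr = lr0 / (1.0 + t)
    theta += lr * phi * (dx - np.dot(theta, phi) * dt)

    x += dx

print("estimated drift coefficients:", theta)
```

In this toy setup the homogenized drift is linear, so the first coefficient should approach the (negative) effective drift coefficient while the cubic coefficient stays close to zero; evaluating the basis at the unfiltered trajectory instead of `z` would exhibit the bias that motivates the filtering step.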