The problem of recovery of an unknown regression function f(x), x ∈ ℝ¹, from noisy data is considered. The function f(·) is assumed to belong to a class of functions analytic in a strip of the complex plane around the real axis. The performance of an estimator is measured either by its deviation at a fixed point or by its maximal error in the L∞-norm over a bounded interval. It is shown that in the case of equidistant observations, with an increasing design density, asymptotically minimax estimators of the unknown regression function can be found within the class of linear estimators. Such best linear estimators are obtained explicitly.

Nonparametric regression models are widely studied in the statistical literature, since they are mathematically attractive and have many useful applications. Recent results of Ibragimov and Hasminskii (1981; 1982a; 1984a) and Stone (1980; 1982) marked a new approach to these models, with the emphasis on optimal (minimax) rates of convergence in estimating an unknown regression function in various functional classes. In some remarkable cases not only minimax rates of convergence but also exact asymptotic constants have been found, and the corresponding asymptotically minimax estimators have been derived. Pinsker (1980) was the first to do this, in the problem of regression estimation in continuous-time Gaussian white noise. He obtained estimators of the regression function that are asymptotically minimax in the L2-norm, with the underlying functional classes defined as ellipsoids in L2. These classes include as special cases Sobolev's classes as well as classes of periodic analytic functions. In the ensuing papers of Nussbaum (1985) and Golubev and Nussbaum (1992), this study was extended to regression models in discrete time and Sobolev's classes of functions. Another example of asymptotically efficient nonparametric regression estimators was given by Ibragimov and Hasminskii (1984b) for the problem of estimating the unknown
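To fix ideas, a linear estimator of f(x0) from noisy equidistant observations is simply a weighted average of the data. The sketch below uses a Gaussian kernel with a hand-picked bandwidth purely for illustration; both choices are hypothetical and not the exactly optimal weights derived in the paper for analytic classes.

```python
import math

def linear_estimate(x0, xs, ys, h):
    """Linear (kernel-weighted) estimate of f(x0) from samples (xs, ys).

    The Gaussian kernel and the bandwidth h are illustrative choices only;
    an asymptotically minimax estimator uses specially derived weights.
    """
    weights = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, ys)) / total

# Equidistant design on [0, 1]; noiseless samples of a smooth function,
# so the estimate at x0 = 0.25 should be close to sin(pi/2) = 1,
# up to a small smoothing bias.
n = 101
xs = [i / (n - 1) for i in range(n)]
ys = [math.sin(2 * math.pi * x) for x in xs]
print(linear_estimate(0.25, xs, ys, h=0.05))
```

With noisy observations, the bandwidth would trade variance (more averaging) against bias (oversmoothing); the paper's point is that for analytic functions this trade-off can be resolved exactly, within the linear class, in the asymptotically minimax sense.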