In a seminal paper, Godambe [1985. The foundations of finite sample estimation in stochastic processes. Biometrika 72, 419–428.] introduced the ‘estimating function’ approach to the estimation of parameters in semi-parametric models under a filtering associated with a martingale structure. Later, Godambe [1987. The foundations of finite sample estimation in stochastic processes II. Bernoulli, Vol. 2. VNU Science Press, 49–54.] and Godambe and Thompson [1989. An extension of quasi-likelihood estimation. J. Statist. Plann. Inference 22, 137–172.] replaced this filtering with a more flexible conditioning. Abraham et al. [1997. On the prediction for some nonlinear time-series models using estimating functions. In: Basawa, I.V., et al. (Eds.), IMS Selected Proceedings of the Symposium on Estimating Functions, Vol. 32. pp. 259–268.] and Thavaneswaran and Heyde [1999. Prediction via estimating functions. J. Statist. Plann. Inference 77, 89–101.] invoked the theory of estimating functions for one-step-ahead prediction in time-series models. This paper addresses the problem of simultaneous estimation of parameters and multi-step-ahead prediction of a vector of future random variables in semi-parametric models by extending the inimitable approach of Godambe (1985, 1987). The proposed technique conforms to the paradigm of the modern theory of estimating functions, leading to finite-sample optimality within a chosen class of estimating functions, which in turn are used to obtain the predictors. Particular applications of the technique yield predictors that enjoy optimality properties with respect to other well-known criteria.
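As a minimal illustration of the general idea (a sketch, not the paper's own construction), consider the AR(1) model X_t = θX_{t−1} + ε_t with martingale-difference errors. Within the class of estimating functions Σ a_{t−1}(X_t − θX_{t−1}), the Godambe-optimal choice a_{t−1} ∝ X_{t−1} gives the conditional least-squares estimator, and the fitted θ̂ then supplies the h-step-ahead predictors θ̂^h X_n:

```python
import numpy as np

def fit_ar1_estimating_function(x):
    """Solve the optimal (Godambe) estimating function for AR(1):
    sum_t x[t-1] * (x[t] - theta * x[t-1]) = 0,
    i.e. the conditional least-squares estimator of theta."""
    x = np.asarray(x, dtype=float)
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

def predict_ar1(x, theta, h):
    """h-step-ahead predictors: x_hat_{n+k} = theta**k * x_n, k = 1..h."""
    return theta ** np.arange(1, h + 1) * x[-1]

# Usage: simulate an AR(1) path (theta = 0.5) and recover the parameter,
# then produce the multi-step-ahead predictors from the same estimate.
rng = np.random.default_rng(0)
theta_true, n = 0.5, 5000
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = theta_true * x[t - 1] + rng.normal()

theta_hat = fit_ar1_estimating_function(x)
preds = predict_ar1(x, theta_hat, h=3)
```

In this special case the estimating-function predictor coincides with the usual least-squares plug-in predictor; the paper's contribution is the simultaneous treatment of estimation and multi-step prediction in the general semi-parametric setting.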