Markov chain Monte Carlo approaches have been widely used for Bayesian inference. The drawback of these methods is that they can be computationally prohibitive, especially when complex models are analyzed. In such cases, variational methods may provide an efficient and attractive alternative. However, the variational methods reported to date are applicable to relatively simple models, and most are based on a factorized approximation to the posterior distribution. Here, we propose a variational approach that can handle models consisting of a system of differential-algebraic equations and whose posterior approximation is represented by a multivariate distribution. Under the proposed approach, the solution of the variational inference problem is decomposed into three steps: (i) a maximum a posteriori (MAP) optimization, which is facilitated by an orthogonal collocation approach; (ii) a preprocessing step, based on estimating the eigenvectors of the posterior covariance matrix; and (iii) an expectation-propagation optimization problem. To tackle multivariate integration, we employ quadrature rules derived from the Smolyak rule (sparse grids). Examples are reported to elucidate the advantages and limitations of the proposed methodology, and the results are compared to the solutions obtained from a Markov chain Monte Carlo approach. It is demonstrated that significant computational savings can be gained using the proposed approach. This article has supplementary material online.
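To illustrate the sparse-grid idea mentioned above, the following is a minimal sketch of Smolyak quadrature via the standard combination formula, using one-dimensional Gauss-Hermite rules rescaled to the standard normal weight. The level parameter, the choice of Gauss-Hermite rules, and all function names are illustrative assumptions, not the paper's implementation.

```python
import itertools
from math import comb

import numpy as np


def gauss_hermite_normal(n):
    """1-D Gauss-Hermite rule rescaled so that it integrates against
    the standard normal density (nodes x = sqrt(2)*t, weights w/sqrt(pi))."""
    t, w = np.polynomial.hermite.hermgauss(n)
    return np.sqrt(2.0) * t, w / np.sqrt(np.pi)


def smolyak_quadrature(f, d, level):
    """Approximate E[f(X)] for X ~ N(0, I_d) with a Smolyak sparse grid.

    Combination formula: sum over multi-indices i with
    level - d + 1 <= |i| <= level of signed tensor products of the
    1-D rules, where the rule at level i_k uses i_k nodes.
    """
    total = 0.0
    for i in itertools.product(range(1, level + 1), repeat=d):
        s = sum(i)
        if not (level - d + 1 <= s <= level):
            continue  # multi-index outside the Smolyak index set
        coeff = (-1) ** (level - s) * comb(d - 1, level - s)
        rules = [gauss_hermite_normal(ik) for ik in i]
        # Tensor product of the selected 1-D rules.
        for point in itertools.product(*[list(zip(*r)) for r in rules]):
            x = np.array([node for node, _ in point])
            w = np.prod([weight for _, weight in point])
            total += coeff * w * f(x)
    return total
```

For example, `smolyak_quadrature(lambda x: x[0]**2 + x[1]**2, d=2, level=3)` recovers the exact second moment (2.0) of a two-dimensional standard normal while using far fewer nodes than the full tensor-product grid, which is the motivation for sparse grids in higher dimensions.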