Abstract

Many time series arising in practice are best considered as components of some vector-valued (multivariate) time series $\{X_t\}$ having not only serial dependence within each component series $\{X_{ti}\}$ but also interdependence between the different component series $\{X_{ti}\}$ and $\{X_{tj}\}$, $i \neq j$. Much of the theory of univariate time series extends in a natural way to the multivariate case; however, new problems arise. In this chapter we introduce the basic properties of multivariate series and consider the multivariate extensions of some of the techniques developed earlier. In Section 7.1 we introduce two sets of bivariate time series data for which we develop multivariate models later in the chapter. In Section 7.2 we discuss the basic properties of stationary multivariate time series, namely the mean vector $\mu = E X_t$ and the covariance matrices $\Gamma(h) = E(X_{t+h} X_t') - \mu\mu'$, $h = 0, \pm 1, \pm 2, \ldots$, with reference to some simple examples, including multivariate white noise. Section 7.3 deals with estimation of $\mu$ and $\Gamma(\cdot)$ and the question of testing for serial independence on the basis of observations of $X_1, \ldots, X_n$. In Section 7.4 we introduce multivariate ARMA processes and illustrate the problem of multivariate model identification with an example of a multivariate AR(1) process that also has an MA(1) representation. (Such examples do not exist in the univariate case.) The identification problem can be avoided by confining attention to multivariate autoregressive (or VAR) models. Forecasting multivariate time series with known second-order properties is discussed in Section 7.5, and in Section 7.6 we consider the modelling and forecasting of multivariate time series using the multivariate Yule-Walker equations and Whittle's generalization of the Durbin-Levinson algorithm. Section 7.7 contains a brief introduction to the notion of cointegrated time series.

Keywords: Multivariate Time Series, Transfer Function Model, Autocovariance Function, Univariate Time Series, Large Sample Approximation
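As a concrete illustration of the estimation problem treated in Section 7.3, the following is a minimal sketch (in Python with NumPy; the function name sample_mean_and_acvf and the simulated white-noise data are illustrative assumptions, not taken from the chapter) of computing the sample mean vector $\hat\mu$ and the sample covariance matrices $\hat\Gamma(h)$ from $n$ observations of an $m$-variate series.

```python
import numpy as np

def sample_mean_and_acvf(X, max_lag):
    """Sample mean vector and sample covariance matrices Gamma_hat(h)
    for a multivariate series X of shape (n, m): n observations, m components.
    (Illustrative sketch; not the book's code.)"""
    n, m = X.shape
    mu_hat = X.mean(axis=0)                    # estimate of mu = E X_t
    Xc = X - mu_hat                            # centred observations
    gamma_hat = np.zeros((max_lag + 1, m, m))
    for h in range(max_lag + 1):
        # Gamma_hat(h) = (1/n) * sum_{t=1}^{n-h} (X_{t+h} - mu_hat)(X_t - mu_hat)'
        gamma_hat[h] = Xc[h:].T @ Xc[:n - h] / n
    return mu_hat, gamma_hat

# Example: bivariate Gaussian white noise, for which Gamma(h) should be
# approximately the zero matrix for every h >= 1.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2))
mu_hat, gamma_hat = sample_mean_and_acvf(X, max_lag=2)
print(mu_hat)          # close to the zero vector
print(gamma_hat[0])    # close to the identity covariance matrix
print(gamma_hat[1])    # close to the zero matrix
```

For the Yule-Walker fitting of a VAR(1) model mentioned in connection with Section 7.6, the same quantities would give the coefficient estimate $\hat\Phi = \hat\Gamma(1)\hat\Gamma(0)^{-1}$ (in the sketch above, `gamma_hat[1] @ np.linalg.inv(gamma_hat[0])`), since for a VAR(1) process $\Gamma(1) = \Phi\,\Gamma(0)$.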
