Abstract

This article proposes a new approach to modeling high-dimensional time series by treating a p-dimensional time series as a nonsingular linear transformation of certain common factors and idiosyncratic components. Unlike approximate factor models, we assume that the factors capture all the nontrivial dynamics of the data, but the cross-sectional dependence may be explained by both the factors and the idiosyncratic components. Under the proposed model, (a) the factor process is dynamically dependent and the idiosyncratic component is a white noise process, and (b) the largest eigenvalues of the covariance matrix of the idiosyncratic components may diverge to infinity as the dimension p increases. We propose a white noise testing procedure for high-dimensional time series to determine the number of white noise components and, hence, the number of common factors, and introduce a projected principal component analysis (PCA) to eliminate the diverging effect of the idiosyncratic noise. Asymptotic properties of the proposed method are established for both fixed p and diverging p as the sample size n increases to infinity. We use both simulated data and real examples to assess the performance of the proposed method. We also compare our method with two commonly used methods in the literature concerning the forecastability of the extracted factors and find that the proposed approach not only provides interpretable results, but also performs well in out-of-sample forecasting. Supplementary materials for this article are available online.
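To make the model structure concrete, the following Python sketch simulates data from a simplified version of the setup described above: a p-dimensional series generated as a linear mixing of a few dynamically dependent factors plus serially white but cross-sectionally correlated idiosyncratic noise, followed by a naive PCA-based factor extraction. The dimensions, the AR(1) factor dynamics, and the use of ordinary PCA (rather than the white noise test and projected PCA proposed in the article) are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the structural-factor idea in the abstract; the fixed
# number of factors, AR(1) dynamics, and plain PCA are assumptions made for
# illustration only, not the authors' procedure.
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 500, 20, 3          # sample size, dimension, number of factors

# Dynamically dependent factors: each follows an AR(1) process.
phi = np.array([0.8, 0.6, -0.5])
f = np.zeros((n, r))
for t in range(1, n):
    f[t] = phi * f[t - 1] + rng.standard_normal(r)

# Idiosyncratic components: white noise in time, but cross-sectionally
# correlated (their covariance matrix is not diagonal).
S = rng.standard_normal((p, p))
sigma_eps = 0.5 * (S @ S.T) / p + 0.5 * np.eye(p)
eps = rng.multivariate_normal(np.zeros(p), sigma_eps, size=n)

# Observed series: a linear transformation of the factors plus noise.
A = rng.standard_normal((p, r))
y = f @ A.T + eps

# Naive factor extraction via PCA on the sample covariance matrix;
# the leading r eigenvectors span an estimate of the loading space.
y_c = y - y.mean(axis=0)
cov = y_c.T @ y_c / n
eigvals, eigvecs = np.linalg.eigh(cov)
loadings = eigvecs[:, ::-1][:, :r]      # top-r eigenvectors
factors_hat = y_c @ loadings            # estimated factor series

print("Leading eigenvalues:", np.round(eigvals[::-1][:5], 2))
```

In the article, the number of factors would instead be determined from the data by the white noise test, and the projected PCA would be used to remove the effect of the possibly diverging idiosyncratic covariance; the hard-coded r = 3 and plain PCA above serve only to illustrate the decomposition into dynamic factors and white noise components.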
