The information coefficient (IC), defined as the correlation between stock returns and the factor exposures used as their predictors, is one of the most commonly used statistics in quantitative financial analysis. In this paper, we establish consistency and asymptotic normality of the time series average of cross-sectional sample ICs when the true underlying ICs between the risk-adjusted residual returns and the standardized factor exposures are time-varying. We use these results to show that the time series average of the cross-sectional sample ICs, divided by its sample standard deviation, converges to the ex ante expected portfolio information ratio (IR) derived in Ding and Martin (2017). A simulation study based on a true factor model shows that the finite-sample results are strikingly close to what the theory predicts. We also conduct empirical simulations using actual stock returns and quantitative factor exposures, and find that the logarithm of the estimated IR is explained very well by a function of the IC mean, the IC standard deviation, and the sample size, exactly as predicted by our theory built on a linear factor model with time-varying ICs.
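The relationship summarized above can be illustrated with a minimal Monte Carlo sketch. This is not the paper's simulation design; it assumes a toy one-factor model with time-varying true ICs (all parameter values are illustrative), computes the cross-sectional sample IC each period, and forms the IR estimate as the time series mean of the sample ICs over their sample standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 500, 1000  # stocks per cross-section, time periods (illustrative sizes)

# Time-varying true ICs drawn around an assumed mean and standard deviation
ic_mean_true, ic_sd_true = 0.05, 0.02
ic_true = rng.normal(ic_mean_true, ic_sd_true, size=T)

sample_ics = np.empty(T)
for t in range(T):
    # Standardized cross-sectional factor exposures
    x = rng.standard_normal(N)
    x = (x - x.mean()) / x.std()
    # Residual returns with cross-sectional correlation ic_true[t] to x
    eps = rng.standard_normal(N)
    r = ic_true[t] * x + np.sqrt(1.0 - ic_true[t] ** 2) * eps
    # Cross-sectional sample IC for period t
    sample_ics[t] = np.corrcoef(x, r)[0, 1]

ic_bar = sample_ics.mean()        # time series average of sample ICs
ic_sd = sample_ics.std(ddof=1)    # time series standard deviation of sample ICs
ir_est = ic_bar / ic_sd           # sample-IC-based IR estimate

print(f"IC mean: {ic_bar:.4f}, IC sd: {ic_sd:.4f}, IR estimate: {ir_est:.3f}")
```

Note that `ic_sd` exceeds the true IC standard deviation of 0.02, because the cross-sectional sampling error of each sample IC (on the order of 1/sqrt(N)) adds to the variation of the true ICs; this is why, as the abstract states, the estimated IR depends on the IC mean, the IC standard deviation, and the sample size.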