- Research Article
- 10.1093/ectj/utae003
- Jan 27, 2024
- The Econometrics Journal
- Kirill Borusyak + 2 more
Summary Many studies in economics use instruments or treatments that combine a set of exogenous shocks with other predetermined variables via a known formula. Examples include shift-share instruments and measures of social or spatial spillovers. We review recent econometric tools for this setting, which leverage the assignment process of the exogenous shocks and the structure of the formula for identification. We compare this design-based approach with conventional estimation strategies based on conditional unconfoundedness, and contrast it with alternative strategies that leverage a model for unobservables.
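A minimal sketch of the known-formula setting this abstract describes, using a shift-share (Bartik) instrument: predetermined exposure shares are combined with exogenous sector-level shocks via a known linear formula. All numbers below are randomly generated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_sectors = 5, 3

# Predetermined exposure shares: each region's sectoral composition
# (rows sum to one by construction of the Dirichlet draw).
shares = rng.dirichlet(np.ones(n_sectors), size=n_regions)

# Exogenous sector-level shocks, e.g., national industry growth rates.
shocks = rng.normal(size=n_sectors)

# Shift-share instrument: a known linear formula combining the two.
z = shares @ shocks
```

The design-based approach discussed in the paper leverages the assignment process of `shocks` (rather than conditional unconfoundedness of `z` itself) for identification.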
- Research Article
- 10.1093/ectj/utae002
- Jan 16, 2024
- The Econometrics Journal
- Jörg Breitung + 2 more
Summary This paper studies the asymptotic properties of endogeneity corrections based on nonlinear transformations without external instruments, which were originally proposed by Park and Gupta (2012) and have become popular in applied research. In contrast to the original copula-based estimator, our approach is based on a nonparametric control function and does not require a conformably specified copula. Moreover, we allow for exogenous regressors, which may be (linearly) correlated with the endogenous regressor(s). We establish consistency, asymptotic normality, and validity of the bootstrap for the unknown model parameters. An empirical application on wage data of the US Current Population Survey demonstrates the usefulness of the method.
- Research Article
- 10.1093/ectj/utae001
- Jan 12, 2024
- The Econometrics Journal
- Qiang Liu + 1 more
Summary Jumps and market microstructure noise are stylized features of high-frequency financial data. It is well known that they bias the estimation of asset volatility (both integrated and spot), and many methods have been proposed to deal with this problem. When jumps are intense, with infinite variation, no efficient estimator of spot volatility under serially dependent noise is available, so one is needed. For this purpose, we propose a novel estimator of spot volatility that combines the pre-averaging technique with the empirical characteristic function. Under mild assumptions, we establish consistency and asymptotic normality of our estimator. Furthermore, we show that our estimator achieves an almost efficient convergence rate with optimal variance when the jumps are either less active or active with a symmetric structure. Simulation studies verify our theoretical conclusions. We apply the proposed estimator to empirical analyses, such as estimating the weekly volatility curve from second-by-second transaction price data.
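The pre-averaging idea can be sketched for the simpler problem of integrated (rather than spot) variance under i.i.d. (rather than serially dependent) noise and no jumps; the simulation below is illustrative only and omits the paper's bias correction and characteristic-function machinery. All parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 23400                    # one-second observations over a trading day
dt = 1.0 / n
sigma = 1.0                  # constant volatility, so integrated variance = 1
eta = 0.005                  # std of i.i.d. microstructure noise

x = np.cumsum(sigma * np.sqrt(dt) * rng.normal(size=n))  # efficient log-price
y = x + eta * rng.normal(size=n)                         # observed log-price

r = np.diff(y)
rv_naive = np.sum(r ** 2)    # realized variance, severely biased by noise

# Pre-averaging: weight returns within overlapping windows of length k by
# g(s) = min(s, 1 - s), which averages away the i.i.d. noise component.
k = int(np.sqrt(n))
j = np.arange(1, k)
g = np.minimum(j / k, 1 - j / k)
psi2 = np.sum(g ** 2) / k
ybar = np.convolve(r, g[::-1], mode="valid")  # pre-averaged returns
iv_pre = np.sum(ybar ** 2) / (k * psi2)       # bias-uncorrected estimator
```

Here `rv_naive` inflates toward 1 + 2nη² while `iv_pre` stays near the true integrated variance of 1, which is the motivation for pre-averaging before any further refinement.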
- Research Article
- 10.1093/ectj/utad028
- Dec 27, 2023
- The Econometrics Journal
- Arthur Lewbel + 2 more
Summary We consider peer effect estimation in social network models where some network links are incorrectly measured. We show that if the number or magnitude of mismeasured links does not grow too quickly with the sample size, then standard instrumental variables estimators that ignore these measurement errors remain consistent, and standard asymptotic inference methods remain valid. These results hold even when the link measurement errors are correlated with regressors or with structural errors in the model. Simulations and real data experiments confirm our results in finite samples. These findings imply that researchers can ignore small numbers of mismeasured links in networks.
- Research Article
- 10.1093/ectj/utad027
- Dec 21, 2023
- The Econometrics Journal
- Ilya Archakov + 2 more
Summary We propose a new method for generating random correlation matrices that makes it simple to control both location and dispersion. The method is based on a vector parameterization, $\gamma =g(C)$, which maps any distribution on $\mathbb {R}^{n(n-1)/2}$ to a distribution on the space of nonsingular $n\times n$ correlation matrices. Correlation matrices with certain properties, such as being well-conditioned, having block structures, and having strictly positive elements, are simple to generate. We compare the new method with existing methods.
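A sketch of the inverse map from a vector $\gamma$ to a correlation matrix, assuming (as in Archakov and Hansen's parameterization) that $\gamma = g(C)$ collects the off-diagonal entries of the matrix logarithm of $C$; the diagonal of the log-matrix is then pinned down by a simple fixed-point iteration so that the exponential has a unit diagonal. Details of the paper's method may differ.

```python
import numpy as np

def expm_sym(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.T

def corr_from_gamma(gamma, n, tol=1e-12, max_iter=200):
    """Map gamma in R^{n(n-1)/2} to a nonsingular n x n correlation matrix.

    gamma fills the off-diagonal of a symmetric matrix A; the diagonal of A
    is adjusted by fixed-point iteration until expm(A) has a unit diagonal.
    """
    A = np.zeros((n, n))
    lower = np.tril_indices(n, -1)
    A[lower] = gamma
    A = A + A.T
    d = np.zeros(n)
    for _ in range(max_iter):
        np.fill_diagonal(A, d)
        step = np.log(np.diag(expm_sym(A)))
        d -= step
        if np.max(np.abs(step)) < tol:
            break
    np.fill_diagonal(A, d)
    return expm_sym(A)

# Draw gamma from any distribution on R^{n(n-1)/2} to get a random C.
rng = np.random.default_rng(0)
n = 4
gamma = rng.normal(scale=0.3, size=n * (n - 1) // 2)
C = corr_from_gamma(gamma, n)
```

Because any distribution on $\gamma$ induces a distribution on valid correlation matrices, controlling the location and spread of $\gamma$ directly controls the location and dispersion of the generated matrices.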
- Research Article
- 10.1093/ectj/utad025
- Nov 28, 2023
- The Econometrics Journal
- Alexander Klein + 1 more
Summary This paper proposes an estimator which combines spatial differencing with a two-step sample selection estimator. We derive identification, estimation, and inference results in the presence of ‘site-specific’ unobserved effects. These effects operate at a spatial scale that cannot be captured by administrative borders; therefore, we use spatial differencing. We show that under justifiable assumptions, the estimator is consistent and asymptotically normal. A Monte Carlo experiment illustrates the small-sample properties of our estimator. We apply our procedure to the estimation of a female wage offer equation in the United States, and the results show the relevance of spatial differencing for accounting for ‘site-specific’ unobserved effects.
- Research Article
- 10.1093/ectj/utad026
- Nov 24, 2023
- The Econometrics Journal
- Yanglin Li + 3 more
Summary This paper proposes a new test for unit root processes with a partial quadratic trend at an unknown break date, denoted herein as the URQ process. Such a process closely resembles an explosive bubble process, and both can capture sharp rises in prices. We develop the asymptotic distributions under the local-to-unity hypothesis, which covers the URQ null and explosive-root alternatives. Simulations show that the test has good finite-sample performance and can differentiate explosive bubble processes from URQ processes. An application to Kweichow Moutai and Apple stocks, which exhibit striking price rises during their respective sample periods, shows that both prices follow URQ processes. We further provide a fundamental analysis: the significant increases in earnings, returns, dividends, and fundamental scores after the partial quadratic trend begins provide evidence that a fundamental improvement, rather than a bubble, mainly drives these drastic price rises.
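One plausible reading of a URQ process (the exact specification here is an assumption, not the paper's): a driftless unit root whose deterministic quadratic trend switches on only after the break date, producing a bubble-like sharp rise late in the sample. The break date and trend coefficient below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
T, tau = 200, 150        # sample size and break date (unknown in practice)
t = np.arange(T)

# Quadratic trend active only on the post-break subsample ("partial").
trend = np.where(t >= tau, 0.01 * (t - tau) ** 2, 0.0)

# Unit root component plus the partial quadratic trend.
y = np.cumsum(rng.normal(size=T)) + trend
```

A path like `y` rises sharply after `tau` even though its stochastic part has a unit root throughout, which is why such series are easily mistaken for explosive bubbles.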
- Research Article
- 10.1093/ectj/utad024
- Nov 15, 2023
- The Econometrics Journal
- Serena Ng + 1 more
Summary Monthly and weekly economic indicators are often taken to be the largest common factor estimated from high- and low-frequency data, either separately or jointly. To incorporate mixed-frequency information without modelling it directly, we target a low-frequency diffusion index that is already available and treat high-frequency values as missing. We impute these values using multiple factors estimated from the high-frequency data. In the empirical examples considered, static matrix completion that does not account for serial correlation in the idiosyncratic errors yields imprecise estimates of the missing values, irrespective of how the factors are estimated. Single-equation and systems-based dynamic procedures that account for serial correlation yield imputed values closer to the observed low-frequency ones. This is the case in the counterfactual exercise that imputes the monthly values of the consumer sentiment series before 1978, when the data were released only quarterly. It is also the case for a weekly version of the Chicago Fed National Activity Index that is imputed using seasonally unadjusted data. The imputed series reveals episodes of increased variability in weekly economic information that are masked by the monthly data, notably around the 2014–2015 collapse in oil prices.
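A static version of the imputation step can be sketched as follows: estimate factors from the complete high-frequency panel by principal components, regress the observed low-frequency index values on them, and fill in the missing periods. This is the simple static scheme the abstract contrasts with the dynamic procedures; the data-generating process is simulated and illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N, r = 120, 30, 2

F = rng.normal(size=(T, r))                    # latent monthly factors
L = rng.normal(size=(N, r))                    # loadings
X = F @ L.T + 0.1 * rng.normal(size=(T, N))    # high-frequency panel

# Target index observed only every third period (quarterly vs. monthly).
beta = np.array([1.0, -0.5])
y = F @ beta + 0.1 * rng.normal(size=T)
obs = np.arange(T) % 3 == 0

# Step 1: principal-components estimate of the factors from the panel.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
F_hat = U[:, :r] * np.sqrt(T)

# Step 2: regress observed low-frequency values on the estimated factors,
# then fill in the missing periods (static, ignores serial correlation).
b = np.linalg.lstsq(F_hat[obs], y[obs], rcond=None)[0]
y_imputed = np.where(obs, y, F_hat @ b)
```

The paper's point is that when the idiosyncratic errors are serially correlated, this static fill-in is imprecise and dynamic single-equation or systems procedures do better.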
- Research Article
- 10.1093/ectj/utad023
- Oct 23, 2023
- The Econometrics Journal
- Gianluca Cubadda + 1 more
Summary This paper extends the multivariate index autoregressive model to cointegrated time series of order (1,1). In this new model, the vector error-correction index model (VECIM), the first differences of the series are driven by a small number of linear combinations of the variables, the indexes. When the indexes are significantly fewer than the variables, the VECIM achieves a substantial dimension reduction relative to the vector error correction model. We show that the VECIM allows one to decompose the reduced-form errors into sets of common and uncommon shocks, and that the former can be further decomposed into permanent and transitory shocks. Moreover, we offer a switching algorithm for optimal estimation of the VECIM. Finally, we document the practical value of the proposed approach through both simulations and an empirical application, in which we search for the shocks that drive aggregate fluctuations at different frequency bands in the US.
- Research Article
- 10.1093/ectj/utad019
- Oct 9, 2023
- The Econometrics Journal
- Rahul Singh + 1 more
Summary We propose a semi-parametric test to evaluate (a) whether different instruments induce subpopulations of compliers with the same observable characteristics, on average; and (b) whether compliers have observable characteristics that are the same as the full population, treated subpopulation, or untreated subpopulation, on average. The test is a flexible robustness check for the external validity of instruments. To justify the test, we characterise the doubly robust moment for Abadie’s class of complier parameters, and we analyse a machine learning update to weighting that we call the automatic $\kappa$ weight. We use the test to reinterpret Angrist and Evans' different local average treatment effect estimates obtained using different instrumental variables.
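Abadie's κ weight, which underlies the complier parameters discussed here, can be sketched in a simulation where compliance type is observed by construction; the instrument propensity is a known constant for simplicity (the paper's automatic κ weight handles the general case with covariates).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x = rng.normal(size=n)                 # observable characteristic
z = rng.integers(0, 2, size=n)         # binary instrument, P(Z=1) = 0.5

# Compliance types: compliers follow Z; always-takers have D=1; never-takers D=0.
typ = rng.choice(['c', 'a', 'n'], size=n, p=[0.5, 0.25, 0.25])
d = np.where(typ == 'c', z, (typ == 'a').astype(int))

p = 0.5  # instrument propensity, known here by construction

# Abadie's kappa weight: E[kappa * g(X)] = P(complier) * E[g(X) | complier].
kappa = 1 - d * (1 - z) / (1 - p) - (1 - d) * z / p
complier_mean_x = np.mean(kappa * x) / np.mean(kappa)
```

Comparing `complier_mean_x` across instruments, or against the full-population mean of `x`, is the kind of observable-characteristics check the proposed test formalizes.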