Abstract

This paper deals simultaneously with linear structural and functional errors-in-variables models (SEIVM and FEIVM), revisiting in this context the ordinary least squares estimators (LSE) for the slope and intercept of the corresponding simple linear regression. It has been known that, subject to some model conditions, these estimators become weakly and strongly consistent in the linear SEIVM and FEIVM with measurement errors having finite variances, provided the explanatory variables have an infinite variance in the SEIVM and a similarly infinite spread in the FEIVM; otherwise, the LSE’s require an adjustment for consistency by the so-called reliability ratio. In this paper, weak and strong consistency, with and without possible rates of convergence, is proved for the LSE’s of the slope and intercept, assuming that the measurement errors are in the domain of attraction of the normal law (DAN) and thus are, for the first time, allowed to have infinite variances. Moreover, these results are obtained under the conditions that the explanatory variables are in DAN, have an infinite variance, and dominate the measurement errors in terms of variation in the SEIVM, and under appropriately matching versions of these conditions in the FEIVM. This duality extends a previously known interplay between SEIVM’s and FEIVM’s.
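
A point of reference for the DAN assumption (the distributional choice below is an assumption made for illustration, not an example from the paper): a random variable can have an infinite variance and still belong to DAN, Student's t with 2 degrees of freedom being a standard case, and for mean-zero variables membership in DAN is equivalent to the self-normalized sums being asymptotically standard normal. A minimal Python sketch:

    import numpy as np

    rng = np.random.default_rng(1)
    n, reps = 5_000, 2_000

    # Student t with 2 degrees of freedom: mean zero, infinite variance, yet in DAN.
    samples = rng.standard_t(2, size=(reps, n))
    s = samples.sum(axis=1)                    # partial sums S_n
    v = np.sqrt((samples ** 2).sum(axis=1))    # self-normalizing factors V_n
    # For a mean-zero distribution in DAN, S_n / V_n is approximately standard normal,
    # so the empirical 5%, 50% and 95% quantiles below should be close to -1.64, 0, 1.64.
    print(np.quantile(s / v, [0.05, 0.50, 0.95]).round(2))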

Highlights

  • In an SEIVM the explanatory variables ξi are assumed to be independent, identically distributed (i.i.d.) random variables (r.v.’s) that are independent of the error terms, while in the case of an FEIVM, one treats them as deterministic variables

  • Unlike in the traditional model with 0 < Var ξ < ∞, the least squares estimators (LSE’s) do not require any adjustment for consistency if Var ξ = ∞, when one can formally put kξ := 1 (see the simulation sketch following these highlights). This can be interpreted as follows: the impact of the finite-variance measurement errors εi in the observables xi is negligible compared to that of the infinite-variance explanatory variables ξi, so much so that the model becomes close in spirit to, and behaves as if it were, the simple linear regression yi = βxi + α + δi, 1 ≤ i ≤ n

  • In Martsynyuk (2005, 2007b, 2009), simultaneously with SEIVM (1.1) with ξ ∈ DAN, we studied FEIVM (1.1) and established new asymptotics in it under conditions on the deterministic explanatory variables that match the condition ξ ∈ DAN and are new and the most general in this context
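
The following minimal Python sketch illustrates the second highlight; all distributional choices in it (standard normal measurement errors, Student t explanatory variables) are assumptions made for illustration and are not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    beta, alpha, n = 2.0, 1.0, 200_000

    def naive_lse(xi):
        # Observables of model (1.1): x_i = xi_i + eps_i, y_i = beta*xi_i + alpha + delta_i,
        # with standard normal (finite-variance) measurement errors eps_i and delta_i.
        x = xi + rng.normal(0.0, 1.0, n)
        y = beta * xi + alpha + rng.normal(0.0, 1.0, n)
        # Ordinary, unadjusted least squares estimators of the slope and intercept.
        b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
        return b, y.mean() - b * x.mean()

    # Finite-variance case: Var(xi) = 1, so the reliability ratio is 1/(1 + 1) = 0.5 and
    # the naive slope estimate settles near beta * 0.5 = 1.0 rather than beta = 2.0.
    print("finite-variance xi:", naive_lse(rng.normal(0.0, 1.0, n)))

    # Infinite-variance case: Student t with 2 degrees of freedom lies in DAN and has
    # Var(xi) = infinity; the naive slope estimate is typically much closer to beta = 2.0
    # with no adjustment, although convergence is slow and some attenuation can remain.
    print("heavy-tailed xi:   ", naive_lse(rng.standard_t(2, n)))

In repeated runs the first estimate stays near 1.0 while the second approaches 2.0 as n grows, in line with formally putting kξ := 1 in the infinite-variance regime.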


Summary

Introduction and main results

Linear structural and functional errors-in-variables models (SEIVM and FEIVM) are of the form

    yi = βξi + α + δi,   xi = ξi + εi,   1 ≤ i ≤ n,   (1.1)

where (yi, xi) ∈ ℝ² are vectors of observations, ξi are unknown explanatory/latent variables, the real-valued slope β and intercept α are to be estimated, and δi and εi are unknown measurement error terms/variables, 1 ≤ i ≤ n, n ∈ ℕ. EIVM (1.1) is known as a measurement error model, or regression with errors in variables. It is a generalization of the simple linear regression yi = βξi + α + δi in that in (1.1) it is assumed that, in addition to the two variables η := βξ + α and ξ being linearly related, η and ξ are observed with the respective measurement errors δi and εi. The model is considered in two versions, indicated by a constant that equals 0 if the intercept α is known to be zero, and 1 if the intercept α is unknown.
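
For orientation, the ordinary least squares estimators of the slope and intercept referred to below are, in their standard textbook form (stated here for reference rather than quoted from the paper),

    β̂n = Σ1≤i≤n (xi − x̄n)(yi − ȳn) / Σ1≤i≤n (xi − x̄n)²,   α̂n = ȳn − β̂n x̄n,

where x̄n and ȳn are the sample means of the xi and yi. In the classical SEIVM with 0 < Var ξ < ∞ and mutually independent ξ, δ and ε, the slope estimator converges to βkξ with the reliability ratio kξ = Var ξ/(Var ξ + Var ε), so that consistency requires dividing β̂n by kξ; the results summarized above show that no such adjustment is needed under the paper's infinite-variance and DAN conditions.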

Least squares estimators for the slope and intercept in SEIVM’s
Least squares estimators in FEIVM’s
Model assumptions and introduction to main results
Main results with remarks
Auxiliary results
Proofs of the main results