Abstract

Let \((X, \Lambda)\) be a pair of random variables, where \(\Lambda\) takes values in \(\Omega\), a compact subset of the real line, with density function \(g(\theta;\alpha)\), and \(X\) is a real-valued random variable whose conditional probability function given \(\Lambda=\theta\) is \(P\{X=x \mid \theta\}\), with \(x = x_0, x_1, \ldots\). Based on \(n\) independent observations of \(X\), \(x^{(n)}\), we estimate the true (unknown) parameter vector \(\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_m)\) of the probability function of \(X\), $$P_\alpha(X = x) = \int_\Omega P\{X=x \mid \theta\}\, g(\theta;\alpha)\, d\theta.$$ A least squares estimator of \(\alpha\) is any vector \(\hat{\alpha}\left(x^{(n)}\right)\) that minimizes $$n^{-1} \sum_{i=1}^{n} \left( P_\alpha(x_i) - f_n(x_i) \right)^2,$$ where \(x^{(n)} = (x_1, x_2, \ldots, x_n)\) is a random sample of \(X\) and \(f_n(x_i) = [\text{number of } x_i \text{ in } x^{(n)}]/n\). It is shown that the least squares estimator exists as the unique solution of the normal equations for all sufficiently large sample sizes \(n\), and that the Gauss-Newton iteration for computing it is numerically stable. The least squares estimators converge to the true values at the rate \(O\left(\sqrt{2 \log\log n / n}\right)\) with probability one, and are asymptotically normally distributed.
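The estimation scheme described above can be sketched numerically. The following is a minimal illustration, not the paper's construction: it assumes a concrete example with \(\Omega = [0,1]\), a Beta\((\alpha_1, \alpha_2)\) mixing density \(g(\theta;\alpha)\), and conditional law \(X \mid \theta \sim \text{Binomial}(K, \theta)\); the quadrature grid, sample size, and starting point are arbitrary choices. The Gauss-Newton step solves the linearized normal equations with a finite-difference Jacobian.

```python
import numpy as np
from math import gamma, comb

# Illustrative assumptions (not from the paper): Omega = [0, 1],
# g(theta; alpha) = Beta(alpha1, alpha2) density, X | theta ~ Binomial(K, theta).
K = 5
THETA = np.linspace(1e-4, 1.0 - 1e-4, 400)   # quadrature grid on Omega
DTHETA = THETA[1] - THETA[0]

def g(theta, a):
    """Beta(alpha1, alpha2) mixing density on [0, 1]."""
    a1, a2 = a
    return (gamma(a1 + a2) / (gamma(a1) * gamma(a2))
            * theta**(a1 - 1) * (1 - theta)**(a2 - 1))

def P(x, a):
    """P_alpha(X = x): quadrature approximation of the mixture integral."""
    cond = comb(K, x) * THETA**x * (1 - THETA)**(K - x)
    return float(np.sum(cond * g(THETA, a)) * DTHETA)

def residuals(a, f_emp):
    # r_x(alpha) = P_alpha(x) - f_n(x) over the support {0, ..., K}
    return np.array([P(x, a) - f_emp[x] for x in range(K + 1)])

def gauss_newton(f_emp, a0, iters=15, h=1e-6):
    """Minimize the sum of squared residuals by Gauss-Newton iteration."""
    a = np.array(a0, dtype=float)
    for _ in range(iters):
        r = residuals(a, f_emp)
        J = np.empty((K + 1, a.size))
        for j in range(a.size):          # finite-difference Jacobian
            ap = a.copy()
            ap[j] += h
            J[:, j] = (residuals(ap, f_emp) - r) / h
        # least-squares solve of J step = r, i.e. the normal equations
        a -= np.linalg.lstsq(J, r, rcond=None)[0]
    return a

# Simulate a sample from the true mixture, form empirical frequencies f_n, and fit
rng = np.random.default_rng(0)
alpha_true = np.array([2.0, 3.0])
thetas = rng.beta(*alpha_true, size=20_000)
xs = rng.binomial(K, thetas)
f_emp = np.bincount(xs, minlength=K + 1) / xs.size

alpha_hat = gauss_newton(f_emp, a0=[1.5, 2.5])
```

Because \(X\) has finite support here, the sum of squares runs over the \(K+1\) distinct values rather than the raw sample, which is equivalent up to weighting by the empirical frequencies.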
