Abstract

We establish a law of large deviations (LLD) for the least squares (LS) estimator θ̂ in a nonlinear regression model with dependent errors, that is, an exponential inequality for the probability of a large deviation of θ̂ from the true parameter θ. The LLD is as sharp as that of Sieders and Dzhaparidze (1987), which assumes independent errors. This generalizes the results of Sieders and Dzhaparidze (1987) and Prakasa Rao (1984).
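For orientation, an exponential inequality of this kind can be written schematically as below; the normalization φ(n), the constants B, b and the exponent α are illustrative placeholders, not the paper's exact quantities.

% Schematic form of a large-deviation (exponential) inequality for the LS estimator;
% \varphi(n), B, b > 0 and \alpha > 0 are placeholders for illustration only.
\[
  \mathbb{P}_{\theta}\bigl( \varphi(n)\,\lVert \hat{\theta}_n - \theta \rVert \ge \rho \bigr)
  \;\le\; B \exp\bigl( -b\,\rho^{\alpha} \bigr),
  \qquad \rho > 0 .
\]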
