Abstract

Estimation theory has been developed over centuries. There are several approaches to applying this theory; in this chapter, they are classified into three types. Type I includes the two oldest methods, the least squares (LS) method and the moment method, neither of which is an optimal estimator. The least squares method was introduced by Carl Friedrich Gauss. Least squares problems fall into linear and nonlinear categories. The linear least squares problem, known in statistics as regression analysis, has a closed-form solution. An important feature of the least squares method is that it makes no probabilistic assumptions about the data. The linear least squares approach is therefore used for parameter estimation, especially in low-complexity designs (Lin, 2008; 2009). The design goal of the least squares estimator is to find a linear function of the observations whose expectation is a linear function of the unknown parameter and whose variance is minimal. In addition, the least squares method coincides with the maximum likelihood (ML) criterion when the experimental errors are normally distributed, and it can also be derived from moment estimation.

As an alternative to the LS method, the moment method is another simple parameter estimation method, but one that does make probabilistic assumptions about the data. The general moment method was introduced by Karl Pearson. The main procedure is to express the unknown parameter in terms of a moment of the distribution and then replace that moment with its sample counterpart, which yields the moment estimator. Although the moment estimator has no optimality properties, its accuracy can be validated with long data records, mainly because the moment estimator is consistent.

Type II includes the minimum variance unbiased estimator (MVUE) and the Bayesian approach, both of which are optimal in the sense of the minimum possible estimation error, i.e., statistical efficiency. The MVUE is the unbiased estimator of an unknown parameter with the smallest variance. The standard MVUE procedure has two steps: first, the Cramér-Rao lower bound is determined and one checks whether some estimator can attain it; second, the Rao-Blackwell-Lehmann-Scheffé (RBLS) theorem is applied. These two steps produce the MVUE. Moreover, a linear MVUE may be found under more restrictive conditions. The Bayesian approach begins with a cost function; the expected cost, averaged over the parameter, is the risk. The design goal of the Bayesian approach is to find an estimator that minimizes the average risk (the Bayes risk). The most
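To make the closed-form linear LS solution mentioned above concrete, the following sketch simulates a linear model x = H·theta + noise and computes theta_hat = (H^T H)^{-1} H^T x with NumPy. The model sizes and names (H, x, theta) are illustrative assumptions, not taken from the chapter.

    import numpy as np

    # Hypothetical linear model x = H @ theta + w with unknown theta.
    rng = np.random.default_rng(0)
    theta_true = np.array([2.0, -1.0])       # used only to simulate data
    H = rng.standard_normal((100, 2))        # known observation matrix
    x = H @ theta_true + 0.1 * rng.standard_normal(100)

    # Closed-form LS solution theta_hat = (H^T H)^{-1} H^T x; lstsq solves
    # the same normal equations in a numerically stable way.
    theta_hat, *_ = np.linalg.lstsq(H, x, rcond=None)
    print(theta_hat)   # close to theta_true, with no probabilistic assumptions

Note that nothing here assumes a noise distribution; that is precisely the feature of LS highlighted above.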
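The moment-method procedure described above can likewise be illustrated with a minimal sketch. The exponential distribution is my choice of example, not the chapter's: its first moment is E[X] = 1/lam, so equating that moment to the sample mean and solving for lam gives the moment estimator.

    import numpy as np

    # Hypothetical data: i.i.d. draws from an exponential distribution
    # with unknown rate lam; the first moment is E[X] = 1/lam.
    rng = np.random.default_rng(1)
    lam_true = 3.0
    samples = rng.exponential(scale=1.0 / lam_true, size=10_000)

    # Moment method: replace E[X] with the sample mean and solve for lam.
    # The law of large numbers makes this estimator consistent, which is
    # why its accuracy improves with long data records.
    lam_hat = 1.0 / samples.mean()
    print(lam_hat)   # approaches lam_true as the sample size grows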
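In standard notation (a standard formulation, not quoted from the chapter), the Cramér-Rao lower bound used in the first MVUE step states that any unbiased estimator \hat{\theta} of a scalar parameter \theta satisfies

    \operatorname{var}(\hat{\theta}) \ge \frac{1}{I(\theta)}, \qquad I(\theta) = -\mathbb{E}\!\left[ \frac{\partial^2 \ln p(\mathbf{x};\theta)}{\partial \theta^2} \right],

where I(\theta) is the Fisher information of the data \mathbf{x}. An unbiased estimator that attains this bound is efficient and is therefore the MVUE.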
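Similarly, the Bayes risk mentioned above can be written in standard form (again, notation assumed rather than quoted from the chapter): for a cost function C(\theta, \hat{\theta}),

    \mathcal{R}(\hat{\theta}) = \mathbb{E}\big[ C(\theta, \hat{\theta}(\mathbf{x})) \big] = \iint C(\theta, \hat{\theta}(\mathbf{x}))\, p(\mathbf{x}, \theta)\, d\mathbf{x}\, d\theta,

where the expectation averages over both the data and the parameter. For the common squared-error cost C(\theta, \hat{\theta}) = (\theta - \hat{\theta})^2, the minimizing estimator is the posterior mean \mathbb{E}[\theta \mid \mathbf{x}].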
