Abstract

We continue the research begun in 1975 on structured estimation. The original work in 1976 by Morgera and Cooper dealt with the Gaussian two-category classification problem when the common covariance matrix is unknown and must be estimated in order to approximate the decision hyperplane that is optimum for the true covariance matrix. We formulate the probability density function (pdf) estimation problem as a multivariate extension of the Rosenblatt-Parzen kernel method in which the multivariate characteristic function (cf) is estimated. A Gaussian form is assumed for the underlying probability distribution, and two methods are presented for the estimation of the covariance matrix in the cf: 1) a maximum-likelihood (MLE) general sample covariance matrix estimate, and 2) a constrained Toeplitz form estimate which takes full advantage of the structure imposed by weak stationarity of the underlying probability distribution. It is shown that both resulting cf estimates are asymptotically unbiased and consistent, although the structured covariance matrix estimate is itself only a first approximation to the MLE and may not be positive definite. It is apparently this difference between the estimators, however, that gives rise to a considerable difference in finite sample size performance. Typical calculations show that the effective sample size increase of the structured estimate can be considerable, a fact of particular importance in nonparametric problems in which data are limited, or in which the sample size-to-dimensionality ratio is small. Applications of this research to the areas of nonparametric pattern recognition and communications theory are discussed.
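To make the two covariance estimators concrete, the sketch below contrasts an unstructured sample covariance (MLE) with a Toeplitz-constrained estimate obtained by averaging along diagonals. This is a minimal illustration, not the paper's exact structured estimator: the function names, the diagonal-averaging rule, and the synthetic data are assumptions chosen only to show the idea of exploiting weak stationarity, and the resulting Toeplitz estimate is, as the abstract notes for the first-approximation case, not guaranteed to be positive definite.

```python
import numpy as np

def sample_covariance_mle(x):
    """Unstructured MLE of the covariance from samples x (n x p),
    assuming zero-mean Gaussian data (divide by n, not n - 1)."""
    n, _ = x.shape
    return x.T @ x / n

def toeplitz_first_approximation(s):
    """Illustrative structured estimate: constrain a p x p sample
    covariance s to Toeplitz form by averaging each diagonal.
    This is only a first approximation and may not be positive definite."""
    p = s.shape[0]
    diag_means = [np.mean(np.diagonal(s, offset=k)) for k in range(p)]
    t = np.empty_like(s)
    for i in range(p):
        for j in range(p):
            t[i, j] = diag_means[abs(i - j)]
    return t

# Usage sketch with synthetic weakly stationary Gaussian data and a
# small sample size-to-dimensionality ratio (hypothetical values).
rng = np.random.default_rng(0)
p, n = 8, 20
true_cov = np.array([[0.9 ** abs(i - j) for j in range(p)] for i in range(p)])
x = rng.multivariate_normal(np.zeros(p), true_cov, size=n)

s_mle = sample_covariance_mle(x)
s_toep = toeplitz_first_approximation(s_mle)
print("unstructured error:", np.linalg.norm(s_mle - true_cov))
print("structured error:  ", np.linalg.norm(s_toep - true_cov))
```

Because the Toeplitz form is described by only p distinct values rather than p(p+1)/2, each value is estimated from more data, which is one intuitive way to see why a structured estimate can behave like one formed from an effectively larger sample.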
