Abstract

The Gauss mixture model (GMM)-based vector quantizer with the quadratic discriminant analysis (QDA) distortion measure provides an approach to statistical image classification problems. Recent work has concentrated on designing tree-structured vector quantizers for image classification using the QDA distortion measure and the BFOS algorithm for pruning. It has been shown that the tree-structured design often increases the correct classification rate at the same design complexity, avoids over-fitting through pruning, and makes it possible to incorporate other classification algorithms such as adaptive boosting. Both the full-search design and the tree-structured design are based on clustering using the Lloyd algorithm. Even when the true underlying distribution of the feature vectors follows (approximately) a Gauss mixture distribution, the variances of the Gaussian components estimated by the clustering algorithm tend to be smaller than those of the true distribution; clustering therefore introduces a variance bias. The work reported here aims to reduce the effects of this variance bias by invoking the central limit theorem for independent summands when the feature vectors are formed as (weighted) sums of the image block pixels. This is done through a joint quantization of the means and covariances of the image blocks and of the feature vectors derived from them. Our simulations indicate that, for both the full-search design and the tree-structured design, our algorithm improves classification accuracy. Finally, for the tree-structured classifier, we introduce a fast algorithm that uses only the median eigenvalue of the covariance matrix (instead of the full covariance matrix) of each Gaussian component in the classification stage.
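To make the classification rule concrete, the sketch below shows the standard QDA-style distortion between a feature vector and a Gaussian component (Mahalanobis distance plus a log-determinant penalty), with classification by minimum distortion over all components. This is a minimal illustration under common assumptions, not the paper's exact design; the function names and the `(label, mean, cov)` component layout are hypothetical.

```python
import numpy as np

def qda_distortion(x, mean, cov):
    """QDA-style distortion of feature vector x against a Gaussian
    component (mean, cov): squared Mahalanobis distance plus log det(cov).
    Sketch only; the paper's exact normalization may differ."""
    diff = x - mean
    # slogdet is numerically safer than log(det(cov)) for large matrices
    sign, logdet = np.linalg.slogdet(cov)
    return diff @ np.linalg.solve(cov, diff) + logdet

def classify(x, components):
    """Assign x the label of the minimum-distortion Gaussian component.
    `components` is a hypothetical list of (label, mean, cov) triples,
    e.g. the codewords of a full-search GMM vector quantizer."""
    return min(components, key=lambda c: qda_distortion(x, c[1], c[2]))[0]
```

A tree-structured version would apply the same distortion at each internal node to choose a child, visiting only a logarithmic number of components; the paper's fast variant further replaces the full covariance with its median eigenvalue (a scaled identity) at classification time.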
