Abstract

A research area based on the application of information theory to machine learning has attracted considerable interest in the last few years. Within the community, this research area has come to be known as information-theoretic learning. In this paper we apply elements of information-theoretic learning to the problem of automatic target recognition (ATR). A number of researchers have previously shown the benefits of designing classifiers that maximize the mutual information between the class data and the class labels. Following this prior research, we show that quadratic mutual information, derived from a special case of the more general Renyi entropy, can be used for classifier design. In this implementation, a simple subspace projection classifier is formulated to find the projection weights that maximize the quadratic mutual information between the class data and the class labels. The projection accomplishes a dimensionality reduction of the raw data set in which information about class membership is retained while irrelevant information is discarded; a subspace chosen by this criterion therefore preserves as much class discriminability as possible. Laser radar images are used to demonstrate the results, and classification performance on this data set is compared for a gradient-descent MLP classifier and a quadratic-mutual-information MLP classifier.
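The quadratic mutual information criterion described above can be estimated directly from samples using Gaussian Parzen windows, which is what makes it practical as a training objective. Below is a minimal sketch, assuming a plain NumPy implementation of the Euclidean-distance (ED) form of QMI and a simple linear projection trained by finite-difference gradient ascent; the kernel width, projection size, toy data, and optimizer are illustrative assumptions and stand in for the paper's MLP trained by backpropagation on the QMI gradient.

```python
# Minimal sketch (assumed implementation, not the paper's code) of the
# Euclidean-distance form of quadratic mutual information (ED-QMI) between a
# linear projection y = x @ W and discrete class labels, estimated with
# Gaussian Parzen windows. Kernel width, data, and optimizer are illustrative.
import numpy as np

def gaussian_gram(y, sigma):
    """Pairwise kernel matrix G_{sigma*sqrt(2)}(y_i - y_j): the convolution of
    two Parzen kernels of width sigma that appears in the QMI estimator."""
    d2 = np.sum((y[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    s2 = 2.0 * sigma ** 2
    dim = y.shape[1]
    return (2.0 * np.pi * s2) ** (-dim / 2.0) * np.exp(-d2 / (2.0 * s2))

def quadratic_mutual_information(y, labels, sigma=1.0):
    """ED-QMI estimate: V_joint + V_marginal - 2 * V_cross."""
    n = len(y)
    G = gaussian_gram(y, sigma)
    classes = np.unique(labels)
    priors = np.array([np.mean(labels == c) for c in classes])
    v_joint = sum(G[np.ix_(labels == c, labels == c)].sum() for c in classes) / n**2
    v_marg = (priors ** 2).sum() * G.sum() / n**2
    v_cross = sum(p * G[labels == c, :].sum()
                  for c, p in zip(classes, priors)) / n**2
    return v_joint + v_marg - 2.0 * v_cross

# Toy usage: learn a 1-D projection of 4-D data by crude finite-difference
# gradient ascent on QMI (the paper instead trains an MLP by backpropagation).
rng = np.random.default_rng(0)
x = np.vstack([rng.normal(-1.0, 1.0, (50, 4)), rng.normal(1.0, 1.0, (50, 4))])
labels = np.array([0] * 50 + [1] * 50)

w = rng.normal(size=(4, 1))
lr, eps = 0.5, 1e-4
for _ in range(100):
    base = quadratic_mutual_information(x @ w, labels)
    grad = np.zeros_like(w)
    for i in range(w.size):
        w_try = w.copy()
        w_try.flat[i] += eps
        grad.flat[i] = (quadratic_mutual_information(x @ w_try, labels) - base) / eps
    w += lr * grad
    w /= np.linalg.norm(w)  # keep the projection bounded (illustrative choice)

print("QMI of learned projection:", quadratic_mutual_information(x @ w, labels))
```

In this sketch the projection weights are normalized after every step so the kernel width stays meaningful relative to the spread of the projected samples; in a full implementation the kernel size is typically annealed or set from the data instead.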
