Abstract

This paper discusses different kinds of dependency. For numerically valued variables, our discussion centers on the maximal correlation coefficient and its cousin, the monotone correlation coefficient. We show how to calculate the maximal correlation coefficient when the random variables take values in a finite set. For non-numerically valued variables, our discussion centers on information-theoretic measures related to mutual information, and we describe some that are also metrics. We visually illustrate the difference between these two kinds of measures with a texture example that computes the joint probability image: an image in which the gray level of each pixel is the joint probability of the gray levels of the pixels in its neighborhood. Neighborhoods can be regular, such as 5 × 5, or irregular. Finally, we discuss manifold methods for classification: the N-tuple method, subspace classifiers, and subspace ensemble classifiers, including the graphical model for representing the class-conditional probability distribution. We describe a procedure to convert an N-tuple classifier to a graphical model classifier. We also conjecture that there is a new form of universal approximation theorem by which not-too-complex classification functions from measurement space to the set of classes can be approximately represented in the form of a subspace classifier using multiple subspaces, as in the N-tuple method.
