Abstract

This paper has two main focal points. We first consider an important class of machine learning algorithms: large margin classifiers, such as support vector machines. The notion of margin complexity quantifies the extent to which a given class of functions can be learned by large margin classifiers. We prove that, up to a small multiplicative constant, margin complexity is equal to the inverse of discrepancy. This establishes a strong tie between seemingly very different notions from two distinct areas. Second, in the same way that matrix rigidity is related to rank, we introduce the notion of rigidity of margin complexity. We prove that sign matrices with small margin-complexity rigidity are very rare. This leads to the question of proving lower bounds on the rigidity of margin complexity. Quite surprisingly, this question turns out to be closely related to basic open problems in communication complexity, e.g., whether PSPACE can be separated from the polynomial hierarchy in communication complexity. There are numerous known relations between the field of learning theory and that of communication complexity, as one might expect, since communication is an inherent aspect of learning. The results of this paper constitute another link in this rich web of relations. This link has already proved significant, as it was used in the solution of a few open problems in communication complexity.
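
To make the first result concrete, the stated equivalence can be sketched in LaTeX as follows. The notation is assumed here rather than fixed by the abstract: mc(A) denotes the margin complexity and disc(A) the discrepancy of a sign matrix A, and c stands for the unspecified multiplicative constant.

\[
  \frac{1}{\operatorname{disc}(A)} \;\le\; \operatorname{mc}(A) \;\le\; \frac{c}{\operatorname{disc}(A)}
  \qquad \text{for every sign matrix } A \in \{-1,+1\}^{m \times n},
\]

where c > 1 is a small universal constant; equivalently, \( \operatorname{mc}(A) = \Theta\!\bigl(1/\operatorname{disc}(A)\bigr) \).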
