Abstract

This paper presents a method for adapting the cost function of the Monge–Kantorovich Problem (MKP) to a classification task. More specifically, we introduce a criterion for learning a cost function that tends to produce large distance values for elements belonging to different classes and small distance values for elements belonging to the same class. Under some additional constraints (one of them being the well-known Monge condition), we show that the optimization of this criterion can be written as a linear programming problem. Experimental results on synthetic data show that the resulting optimal cost function provides good retrieval performance in the presence of two types of perturbations commonly found in histograms. When compared against a set of commonly used cost functions, our optimal cost function performs as well as the best cost function in the set, which shows that it adapts well to the task. Promising results are also obtained on real data for two-class image retrieval based on grayscale intensity histograms.
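
For context, the sketch below illustrates the transportation linear program that underlies the MKP distance between two histograms for a *given* cost matrix; it is not taken from the paper, and it does not show the paper's cost-learning step. The function name `mkp_distance`, the placeholder cost matrix `C`, and the example histograms `p` and `q` are illustrative assumptions, and `scipy.optimize.linprog` is used here only as a convenient generic LP solver.

```python
import numpy as np
from scipy.optimize import linprog

def mkp_distance(p, q, C):
    """Monge-Kantorovich (EMD-style) distance between histograms p and q
    for a fixed ground cost matrix C, solved as a transportation LP."""
    n, m = len(p), len(q)
    # Objective: minimize sum_ij C[i, j] * T[i, j] over transport plans T >= 0.
    c = C.reshape(-1)
    # Row-marginal constraints: sum_j T[i, j] = p[i]
    A_rows = np.zeros((n, n * m))
    for i in range(n):
        A_rows[i, i * m:(i + 1) * m] = 1.0
    # Column-marginal constraints: sum_i T[i, j] = q[j]
    A_cols = np.zeros((m, n * m))
    for j in range(m):
        A_cols[j, j::m] = 1.0
    A_eq = np.vstack([A_rows, A_cols])
    b_eq = np.concatenate([p, q])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun

# Example with normalized histograms and a stand-in (not learned) cost matrix.
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])
C = np.abs(np.subtract.outer(np.arange(3), np.arange(3))).astype(float)
print(mkp_distance(p, q, C))
```

In the paper's setting, the cost matrix `C` would instead be the learned cost function produced by the proposed criterion; the sketch only shows how a fixed cost matrix induces a distance between histograms.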
