Abstract

This paper presents a method for adapting the cost function of the Monge–Kantorovich Problem (MKP) to a classification task. More specifically, we introduce a criterion for learning a cost function that tends to produce large distance values for elements belonging to different classes and small distance values for elements belonging to the same class. Under some additional constraints (one of them being the well-known Monge condition), we show that the optimization of this criterion can be written as a linear programming problem. Experimental results on synthetic data show that the resulting optimal cost function provides good retrieval performance in the presence of two types of perturbations commonly found in histograms. When compared with a set of commonly used cost functions, our optimal cost function performs as well as the best cost function in the set, which shows that it adapts well to the task. Promising results are also obtained on real data for two-class image retrieval based on grayscale intensity histograms.
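
As a rough illustration of the distance that a learned cost function plugs into, the sketch below solves the discrete MKP (earth mover's distance) between two normalized histograms as a linear program over the transport plan. This only shows the distance computation mentioned in the abstract, not the paper's cost-learning procedure; the function name, the example histograms, and the |i - j| ground cost are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def mkp_distance(h1, h2, cost):
    """Solve the discrete Monge-Kantorovich problem between two normalized
    histograms h1 and h2 of length n, given an n x n cost matrix, as a
    linear program over the (flattened) transport plan."""
    n = len(h1)
    c = cost.reshape(-1)  # objective: sum_ij cost[i, j] * plan[i, j]
    # Marginal constraints: rows of the plan sum to h1, columns sum to h2.
    A_eq = np.zeros((2 * n, n * n))
    for i in range(n):
        A_eq[i, i * n:(i + 1) * n] = 1.0  # row i sums to h1[i]
        A_eq[n + i, i::n] = 1.0           # column i sums to h2[i]
    b_eq = np.concatenate([h1, h2])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun

# Example: two 4-bin histograms with a ground cost |i - j| between bins.
h1 = np.array([0.4, 0.3, 0.2, 0.1])
h2 = np.array([0.1, 0.2, 0.3, 0.4])
bins = np.arange(4)
cost = np.abs(bins[:, None] - bins[None, :]).astype(float)
print(mkp_distance(h1, h2, cost))  # MKP distance under this cost
```

In the paper's setting, the fixed cost matrix above would instead be the variable being optimized, subject to constraints such as the Monge condition, so that MKP distances become large across classes and small within classes.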
