Abstract

Multi-instance multi-label learning (MIML) is a machine learning framework in which one data object is described by multiple instances and associated with multiple class labels. During the past few years, many MIML algorithms have been developed and many applications have been described; however, the learnability of MIML has received little theoretical exploration. In this paper, by proving a generalization bound for multi-instance single-label learners and viewing MIML as a number of multi-instance single-label subtasks coupled through the correlations among the labels, we show that the MIML hypothesis class constructed from a multi-instance single-label hypothesis class is PAC-learnable.

Highlights

  • Multi-instance multi-label learning (MIML) is a new machine learning framework where one data object is described by multiple instances and associated with multiple class labels

  • In contrast to traditional supervised learning where one data object is represented by one instance and associated with one class label, in MIML one object is described by multiple instances and associated with multiple class labels (Figure 1)

  • Our result shows that the MIML hypothesis class constructed from a multi-instance single-label hypothesis class is PAC-learnable

Summary

Preliminaries

Multi-instance single-label learning, usually called multi-instance learning [10, 11], can be viewed as a degenerate version of MIML in which all but one of the labels associated with a bag are neglected and only a single label is of concern. Let φ_n(H) = {φ_h^n | h ∈ H} denote the hypothesis class over bags generated from the instance hypothesis class H by the bag-labeling rule φ, let d_I denote the finite VC-dimension of H, and let d_B denote the finite VC-dimension of φ_n(H); based on the results in [9], d_B can be bounded in terms of d_I and the maximal bag size n.

Sabato and Tishby [9] proposed a multi-instance single-label learning algorithm, Misl. Lemma 1 below shows that, under certain conditions, the Misl algorithm outputs an approximation to the optimal edge on the input bag sample [9]. Let H denote the hypothesis class whose members h ∈ H are mappings from X to {−1, +1}, let h_M be the hypothesis returned by Misl when receiving the bag sample S_B as input, and let ω = Λ(h_M, S_B) and ω* = max_{h∈H} Λ(φ_h^n, S_B), where Λ(·, S_B) denotes the edge on S_B.
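To make the bag-level construction concrete, the sketch below illustrates, under the classical multi-instance assumption [10] that a bag is positive iff at least one of its instances is positive, how a bag hypothesis φ_h^n could be derived from an instance hypothesis h and how an empirical edge-like quantity on a bag sample could be computed. This is a minimal sketch: the threshold hypothesis, the helper names, and the simple average used for the edge are illustrative assumptions, not the exact definitions from [9].

```python
import numpy as np

def bag_hypothesis(h, bag):
    """Bag-level prediction phi_h^n under the standard multi-instance
    (OR / max) rule: a bag is labeled +1 iff some instance is labeled +1."""
    instance_labels = [h(x) for x in bag]          # each label is in {-1, +1}
    return max(instance_labels)

def empirical_edge(h, bag_sample):
    """Average of y * phi_h^n(bag) over a labeled bag sample; a common
    empirical-margin quantity (the exact definition of Lambda in [9] may differ)."""
    return np.mean([y * bag_hypothesis(h, bag) for bag, y in bag_sample])

# Illustrative instance hypothesis: a simple threshold on a scalar feature.
h = lambda x: 1 if x > 0.5 else -1

bag_sample = [
    ([0.1, 0.9, 0.3], +1),   # contains a positive instance -> bag is positive
    ([0.2, 0.4, 0.1], -1),   # no positive instance -> bag is negative
]

print(empirical_edge(h, bag_sample))   # 1.0: both bags classified correctly
```

Viewing MIML as a collection of such single-label multi-instance subtasks, one per class label and with the correlations among labels taken into account, is the route the paper takes to its PAC-learnability result.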

Multi-instance single-label generalization bound
MIML learnability
Conclusions