Abstract

In previous chapters, we presented several seminal results on generalization bounds for domain adaptation, together with impossibility theorems for some of them. We showed that the basic shape of generalization bounds in the context of domain adaptation remains more or less the same and differs principally in the divergence used to measure the distance between the source and target marginal distributions. In particular, the original bound proposed by Ben-David et al. is restricted to the 0 − 1 loss function and a symmetric hypothesis class, a drawback addressed by Mansour et al. with the introduction of the discrepancy distance. Despite their differences, the two bounds share several features, namely an explicit dependence on the considered hypothesis class and the computational difficulties involved in estimating the corresponding divergences. Indeed, computing the ℋΔℋ-divergence is intractable, while minimizing the discrepancy distance carries a prohibitive computational complexity. This motivates a natural search for other metrics with attractive properties for quantifying the divergence between the two domains. In this chapter, we consider a large family of metrics on the space of probability measures called integral probability metrics (IPMs), a well-studied topic in probability theory. We show in particular that, depending on the chosen functional class, some instances of IPMs have properties that differ markedly from those exhibited by both the ℋΔℋ-divergence and the discrepancy distance.
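
For reference, the general form of an integral probability metric between two probability measures P and Q over a function class ℱ (a standard formulation from the probability literature, not a definition specific to this chapter) can be sketched as

\[ d_{\mathcal{F}}(P, Q) \;=\; \sup_{f \in \mathcal{F}} \bigl| \, \mathbb{E}_{X \sim P}[f(X)] \;-\; \mathbb{E}_{X \sim Q}[f(X)] \, \bigr|. \]

Well-known instances arise from particular choices of ℱ: the unit ball of a reproducing kernel Hilbert space gives the maximum mean discrepancy, the class of 1-Lipschitz functions gives the Wasserstein-1 distance, and functions bounded by 1 give the total variation distance.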
