Abstract
Domain adaptation aims to use training data from one or multiple source domains to learn a hypothesis that generalizes to a different, but related, target domain. As such, having a reliable measure for evaluating the discrepancy of both marginal and conditional distributions is crucial. We introduce the Cauchy-Schwarz (CS) divergence to the problem of unsupervised domain adaptation (UDA). The CS divergence offers a theoretically tighter generalization error bound than the popular Kullback-Leibler divergence. This holds for the general case of supervised learning, including multi-class classification and regression. Furthermore, we illustrate that the CS divergence enables a simple estimator of the discrepancy between both the marginal and the conditional distributions of the source and target domains in the representation space, without requiring any distributional assumptions. We provide multiple examples to illustrate how the CS divergence can be conveniently used in both distance metric-based and adversarial training-based UDA frameworks, resulting in compelling performance. The code of our paper is available at https://github.com/ywzcode/CS-adv.
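For intuition only, below is a minimal NumPy sketch of a kernel-based empirical CS divergence between a batch of source features and a batch of target features. The Gaussian kernel, the fixed bandwidth `sigma`, and the function name `cs_divergence` are illustrative assumptions and not the authors' implementation; refer to the linked repository for the method used in the paper.

```python
import numpy as np

def cs_divergence(x_s, x_t, sigma=1.0):
    """Empirical Cauchy-Schwarz divergence between two samples,
    based on Gaussian-kernel density estimates (illustrative sketch).

    x_s: (n, d) source-domain features (hypothetical input)
    x_t: (m, d) target-domain features (hypothetical input)
    sigma: kernel bandwidth, assumed fixed here
    """
    def gram(a, b):
        # Pairwise squared Euclidean distances, then Gaussian kernel values.
        d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
        return np.exp(-d2 / (2.0 * sigma**2))

    k_ss = gram(x_s, x_s).mean()  # within-source kernel mean
    k_tt = gram(x_t, x_t).mean()  # within-target kernel mean
    k_st = gram(x_s, x_t).mean()  # cross-domain kernel mean

    # D_CS(p, q) = -log( <p, q>^2 / (<p, p> <q, q>) )
    return np.log(k_ss) + np.log(k_tt) - 2.0 * np.log(k_st)

# Usage: compare 200 source and 150 target feature vectors in R^16.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 16))
tgt = rng.normal(0.5, 1.0, size=(150, 16))
print(cs_divergence(src, tgt, sigma=2.0))
```

The estimate is non-negative and shrinks toward zero as the two feature distributions align, which is why a quantity of this form can serve as either a distance-metric loss or an adversarial training signal in UDA.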
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 4011-4040 |
| Number of pages | 30 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 244 |
| Publication status | Published - 2024 |
| Event | 40th Conference on Uncertainty in Artificial Intelligence (UAI 2024), Barcelona, Spain, 15-19 Jul 2024 |
Bibliographical note
Publisher Copyright: © 2024 Proceedings of Machine Learning Research. All rights reserved.