Domain Adaptation with Cauchy-Schwarz Divergence

  • Wenzhe Yin
  • Shujian Yu*
  • Yicong Lin
  • Jie Liu
  • Jan-Jakob Sonke
  • Efstratios Gavves

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

Domain adaptation aims to use training data from one or multiple source domains to learn a hypothesis that can be generalized to a different, but related, target domain. As such, having a reliable measure for evaluating the discrepancy of both marginal and conditional distributions is crucial. We introduce Cauchy-Schwarz (CS) divergence to the problem of unsupervised domain adaptation (UDA). The CS divergence offers a theoretically tighter generalization error bound than the popular Kullback-Leibler divergence. This holds for the general case of supervised learning, including multi-class classification and regression. Furthermore, we illustrate that the CS divergence enables a simple estimator on the discrepancy of both marginal and conditional distributions between source and target domains in the representation space, without requiring any distributional assumptions. We provide multiple examples to illustrate how the CS divergence can be conveniently used in both distance metric-based and adversarial training-based UDA frameworks, resulting in compelling performance. The code of our paper is available at https://github.com/ywzcode/CS-adv.
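For readers unfamiliar with the quantity, the CS divergence between densities p and q is the standard definition D_CS(p; q) = -log [ (∫ p q)^2 / (∫ p^2 ∫ q^2) ], which is non-negative and zero iff p = q. The snippet below is a minimal sketch, not the authors' released implementation (see the linked repository for that), of the widely used kernel-based empirical estimator of the marginal CS divergence between source and target features; the Gaussian kernel, bandwidth, and example data are assumptions for illustration, and the paper's conditional-discrepancy estimators are not reproduced here.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    """Gaussian kernel matrix between rows of a (n, d) and b (m, d)."""
    sq_dists = (np.sum(a**2, axis=1)[:, None]
                + np.sum(b**2, axis=1)[None, :]
                - 2 * a @ b.T)
    return np.exp(-sq_dists / (2 * sigma**2))

def cs_divergence(x, y, sigma=1.0):
    """Empirical CS divergence via Parzen-window plug-in:
    D_CS = log(mean K_xx) + log(mean K_yy) - 2 * log(mean K_xy).
    Non-negative by the Cauchy-Schwarz inequality; zero when the
    empirical kernel statistics of x and y coincide.
    """
    k_xx = rbf_kernel(x, x, sigma).mean()
    k_yy = rbf_kernel(y, y, sigma).mean()
    k_xy = rbf_kernel(x, y, sigma).mean()
    return np.log(k_xx) + np.log(k_yy) - 2 * np.log(k_xy)

# Toy usage: synthetic "source" and "target" features in representation space.
rng = np.random.default_rng(0)
src = rng.normal(loc=0.0, size=(128, 16))
tgt = rng.normal(loc=0.5, size=(128, 16))
print(cs_divergence(src, tgt, sigma=1.0))
```

In a distance metric-based UDA setup, a quantity of this form would typically be added to the task loss as an alignment penalty on the learned representations; how the paper combines it with adversarial training is described in the article itself.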
Original language: English
Pages (from-to): 4011-4040
Number of pages: 30
Journal: Proceedings of Machine Learning Research
Volume: 244
Publication status: Published - 2024
Event: 40th Conference on Uncertainty in Artificial Intelligence, UAI 2024 - Barcelona, Spain
Duration: 15 Jul 2024 - 19 Jul 2024

Bibliographical note

Publisher Copyright:
© 2024 Proceedings of Machine Learning Research. All rights reserved.
