Learning to Transfer with von Neumann Conditional Divergence

7 August 2021 · arXiv: 2108.03531

Ammar Shaker, Shujian Yu, Daniel Oñoro-Rubio

Topics: OOD, DRL
Abstract

The similarity of feature representations plays a pivotal role in the success of domain adaptation. Feature similarity includes both the invariance of marginal distributions and the closeness of conditional distributions given the desired response y (e.g., class labels). Unfortunately, traditional methods always learn such features without fully taking the information in y into consideration, which in turn may lead to a mismatch of the conditional distributions or a mixing of the discriminative structures underlying the data distributions. In this work, we introduce the recently proposed von Neumann conditional divergence to improve transferability across multiple domains. We show that this new divergence is differentiable and can easily quantify the functional dependence between features and y. Given multiple source tasks, we integrate this divergence to capture the discriminative information in y and design novel learning objectives for the cases where the source tasks are observed either simultaneously or sequentially. In both scenarios, we obtain favorable performance against state-of-the-art methods: a smaller generalization error on new tasks and, in the sequential setup, less catastrophic forgetting on the source tasks.
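For intuition, the plain (unconditional) von Neumann divergence between two symmetric positive semi-definite matrices A and B is D(A || B) = tr(A log A - A log B - A + B); the paper's conditional variant builds on this idea to compare feature distributions conditioned on y. The following is a minimal NumPy sketch of this matrix divergence applied to feature covariance matrices from two synthetic domains; the function and variable names are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def von_neumann_divergence(A, B, eps=1e-10):
        # D(A || B) = tr(A log A - A log B - A + B) for symmetric PSD matrices.
        def logm_psd(M):
            # Matrix logarithm via eigendecomposition; clip tiny eigenvalues
            # so the log stays finite for rank-deficient covariances.
            w, V = np.linalg.eigh(M)
            w = np.clip(w, eps, None)
            return (V * np.log(w)) @ V.T
        return np.trace(A @ logm_psd(A) - A @ logm_psd(B) - A + B)

    # Toy usage: compare feature covariances of two hypothetical domains.
    rng = np.random.default_rng(0)
    Xs = rng.normal(size=(200, 16))        # source-domain features
    Xt = 1.5 * rng.normal(size=(200, 16))  # target-domain features, rescaled
    Cs = np.cov(Xs, rowvar=False) + 1e-6 * np.eye(16)
    Ct = np.cov(Xt, rowvar=False) + 1e-6 * np.eye(16)
    print(von_neumann_divergence(Cs, Ct))  # nonnegative; zero iff Cs == Ct

Because the divergence is built from eigendecompositions and traces, it is differentiable in the matrix entries, which is what allows it to be used directly as a training objective in the paper's setting.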
