
arXiv:2206.05263

Causal Balancing for Domain Generalization

10 June 2022
Xinyi Wang
Michael Stephen Saxon
Jiachen Li
Hongyang R. Zhang
Kun Zhang
William Yang Wang
Topics: OOD, CML
Abstract

While machine learning models rapidly advance the state of the art on various real-world tasks, out-of-domain (OOD) generalization remains a challenging problem given the vulnerability of these models to spurious correlations. We propose a causally motivated balanced mini-batch sampling strategy that transforms the observed training distribution into a balanced distribution free of spurious correlations. We argue that the Bayes optimal classifier trained on such a balanced distribution is minimax optimal across a sufficiently diverse environment space. We also provide an identifiability guarantee for the latent variable model of the proposed underlying data generation process with invariant causal mechanisms, given a sufficient number of training environments. Experiments on three domain generalization datasets demonstrate empirically that our balanced mini-batch sampling strategy improves the performance of four established domain generalization baselines over random mini-batch sampling.
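To make the core idea concrete, the following is a minimal illustrative sketch of balanced mini-batch sampling, not the paper's actual algorithm: each example carries a label and a hypothetical spurious attribute (e.g. its training environment), and batches are drawn uniformly over (label, attribute) groups so that the label-attribute correlation present in the raw data does not appear in the batch. The function name and the tuple layout of `examples` are assumptions for this sketch.

```python
import random
from collections import defaultdict

def balanced_minibatch(examples, batch_size, rng=random):
    """Sample a mini-batch balanced across (label, attribute) groups.

    `examples` is a list of (x, label, attr) tuples, where `attr` is a
    spurious attribute such as the training environment. Allocating the
    batch evenly over groups (sampling with replacement within each group)
    breaks any label-attribute correlation in the observed data.
    """
    groups = defaultdict(list)
    for ex in examples:
        _, label, attr = ex
        groups[(label, attr)].append(ex)

    keys = sorted(groups)
    per_group, rem = divmod(batch_size, len(keys))
    batch = []
    for i, key in enumerate(keys):
        n = per_group + (1 if i < rem else 0)  # spread the remainder
        batch.extend(rng.choices(groups[key], k=n))
    rng.shuffle(batch)
    return batch
```

In this sketch a group that is rare in the raw data is oversampled with replacement, which is one simple way to realize a balanced distribution; the paper's method derives the target balanced distribution from its causal model rather than from this uniform heuristic.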
