
Geometrically Regularized Transfer Learning with On-Manifold and Off-Manifold Perturbation

Main: 7 pages
Bibliography: 1 page
Abstract

Transfer learning under domain shift remains a fundamental challenge due to the divergence between source and target data manifolds. In this paper, we propose MAADA (Manifold-Aware Adversarial Data Augmentation), a novel framework that decomposes adversarial perturbations into on-manifold and off-manifold components to simultaneously capture semantic variation and model brittleness. We theoretically demonstrate that enforcing on-manifold consistency reduces hypothesis complexity and improves generalization, while off-manifold regularization smooths decision boundaries in low-density regions. Moreover, we introduce a geometry-aware alignment loss that minimizes geodesic discrepancy between source and target manifolds. Experiments on DomainNet, VisDA, and Office-Home show that MAADA consistently outperforms existing adversarial and adaptation methods in both unsupervised and few-shot settings, demonstrating superior structural robustness and cross-domain generalization.
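The core operation the abstract describes, splitting an adversarial perturbation into an on-manifold component (semantic variation along the data manifold) and an off-manifold component (brittle directions in low-density regions), can be sketched as a projection onto a locally estimated tangent space. The sketch below is illustrative only and is not the authors' implementation: the function names and the local-PCA tangent estimator are assumptions, standing in for whatever manifold model MAADA actually uses.

```python
import numpy as np

def local_tangent_basis(x, neighbors, k):
    """Estimate a k-dim tangent basis at x via local PCA over neighbors.

    This local-PCA estimator is an illustrative assumption, not the
    paper's manifold model. neighbors: (n, d) nearby data points.
    """
    diffs = neighbors - x  # (n, d) displacement vectors from x
    # Top right singular vectors give the principal local directions.
    _, _, vt = np.linalg.svd(diffs, full_matrices=False)
    return vt[:k].T  # (d, k) orthonormal tangent basis

def decompose_perturbation(delta, tangent_basis):
    """Split a perturbation into on-manifold and off-manifold parts."""
    # On-manifold: orthogonal projection onto the tangent space.
    delta_on = tangent_basis @ (tangent_basis.T @ delta)
    # Off-manifold: residual lying in the normal (low-density) directions.
    delta_off = delta - delta_on
    return delta_on, delta_off
```

Under this split, an on-manifold consistency loss would be applied to `x + delta_on` (a semantically plausible variant) while an off-manifold smoothness term regularizes the prediction at `x + delta_off`, matching the two roles the abstract assigns to the components.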

@article{satou2025_2505.15191,
  title={Geometrically Regularized Transfer Learning with On-Manifold and Off-Manifold Perturbation},
  author={Hana Satou and Alan Mitkiy and F Monkey},
  journal={arXiv preprint arXiv:2505.15191},
  year={2025}
}