Contrastive Flow Matching

Abstract

Unconditional flow matching trains diffusion models to transport samples from a source distribution to a target distribution by enforcing that the flows between sample pairs are unique. However, in conditional settings (e.g., class-conditioned models), this uniqueness is no longer guaranteed: flows from different conditions may overlap, leading to more ambiguous generations. We introduce Contrastive Flow Matching, an extension of the flow matching objective that explicitly enforces uniqueness across all conditional flows, enhancing condition separation. Our approach adds a contrastive objective that maximizes the dissimilarity between predicted flows from arbitrary sample pairs. We validate Contrastive Flow Matching through extensive experiments across varying model architectures on both class-conditioned (ImageNet-1k) and text-to-image (CC3M) benchmarks. Notably, we find that training models with Contrastive Flow Matching (1) improves training speed by a factor of up to 9x, (2) requires up to 5x fewer denoising steps, and (3) lowers FID by up to 8.9 compared to training the same models with flow matching. We release our code at: this https URL.
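
To make the objective concrete, below is a minimal PyTorch sketch of one plausible instantiation of the contrastive flow matching loss described in the abstract. It is not the authors' released implementation: the model interface model(x_t, t, cond), the negative-pair construction via batch rolling, and the repulsion weight lam are all assumptions for illustration.

# Hypothetical sketch of a contrastive flow matching loss (not the authors'
# released code). Assumes a velocity-prediction model `model(x_t, t, cond)`;
# `lam` is an assumed weight on the contrastive (repulsion) term.
import torch

def contrastive_fm_loss(model, x1, cond, lam=0.05):
    """Standard flow matching loss plus a term that repels the predicted
    flow from the target flow of a different sample in the batch."""
    x0 = torch.randn_like(x1)                      # source (noise) samples
    t = torch.rand(x1.shape[0], device=x1.device)  # timesteps in [0, 1]
    t_ = t.view(-1, *([1] * (x1.dim() - 1)))       # broadcastable shape
    xt = (1 - t_) * x0 + t_ * x1                   # linear interpolation path
    v_target = x1 - x0                             # ground-truth velocity
    v_pred = model(xt, t, cond)                    # predicted velocity
    # Negative targets: roll the batch so each prediction is paired with
    # the target flow of an arbitrary other (sample, condition) pair.
    v_neg = torch.roll(v_target, shifts=1, dims=0)
    attract = ((v_pred - v_target) ** 2).mean()
    repel = ((v_pred - v_neg) ** 2).mean()
    return attract - lam * repel

Rolling the batch pairs each prediction with the flow target of a different sample, so the loss simultaneously pulls predictions toward their own flow and pushes them away from an arbitrary other flow, which is one way to encourage the condition separation the abstract describes.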

@article{stoica2025_2506.05350,
  title={Contrastive Flow Matching},
  author={George Stoica and Vivek Ramanujan and Xiang Fan and Ali Farhadi and Ranjay Krishna and Judy Hoffman},
  journal={arXiv preprint arXiv:2506.05350},
  year={2025}
}