EMC$^2$: Efficient MCMC Negative Sampling for Contrastive Learning with Global Convergence

16 April 2024
Chung-Yiu Yau
Hoi-To Wai
Parameswaran Raman
Soumajyoti Sarkar
Mingyi Hong
arXiv:2404.10575
Abstract

A key challenge in contrastive learning is to generate negative samples from a large sample set to contrast with positive samples, in order to learn better encodings of the data. These negative samples often follow a softmax distribution that is dynamically updated during the training process. However, sampling from this distribution is non-trivial due to the high computational cost of computing the partition function. In this paper, we propose an Efficient Markov Chain Monte Carlo negative sampling method for Contrastive learning (EMC$^2$). We follow the global contrastive learning loss introduced in SogCLR, and propose EMC$^2$, which utilizes an adaptive Metropolis-Hastings subroutine to generate hardness-aware negative samples in an online fashion during the optimization. We prove that EMC$^2$ finds an $\mathcal{O}(1/\sqrt{T})$-stationary point of the global contrastive loss in $T$ iterations. Compared to prior works, EMC$^2$ is the first algorithm that exhibits global convergence (to stationarity) regardless of the choice of batch size while exhibiting low computation and memory cost. Numerical experiments validate that EMC$^2$ is effective with small batch training and achieves comparable or better performance than baseline algorithms. We report results for pre-training image encoders on STL-10 and ImageNet-100.
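
The central mechanism described in the abstract is sampling negatives in proportion to a softmax over candidate embeddings without ever computing the partition function. The sketch below is a minimal, hypothetical illustration in Python of this idea using an independence Metropolis-Hastings sampler with a uniform proposal; it is not the paper's actual adaptive subroutine, and the function name, `tau`, and `num_steps` are assumptions made for the example. Its only purpose is to show why the normalizing constant cancels in the acceptance ratio.

```python
import numpy as np

def mh_negative_sampler(anchor, candidates, tau=0.1, num_steps=20, rng=None):
    """Illustrative sketch (not the paper's adaptive subroutine):
    sample a negative index j with probability proportional to
    exp(sim(anchor, candidates[j]) / tau), never computing the
    partition function.

    anchor     : (d,) embedding of the query sample (assumed L2-normalized)
    candidates : (N, d) embeddings of candidate negatives (assumed L2-normalized)
    """
    rng = np.random.default_rng() if rng is None else rng
    n = candidates.shape[0]

    # Unnormalized log-probability of choosing candidate j as a negative.
    def log_score(j):
        return float(anchor @ candidates[j]) / tau

    # Start the chain at a uniformly chosen candidate.
    current = rng.integers(n)
    current_score = log_score(current)

    for _ in range(num_steps):
        # Uniform proposal over the candidate pool (independence sampler).
        proposal = rng.integers(n)
        proposal_score = log_score(proposal)
        # Standard MH acceptance in log space; the partition function cancels
        # because it appears in both the numerator and denominator of the ratio.
        if np.log(rng.random()) < proposal_score - current_score:
            current, current_score = proposal, proposal_score

    return current
```

Because only differences of unnormalized log-scores enter the acceptance test, each step costs one inner product rather than a pass over all candidates, which is the cost structure the abstract refers to when contrasting with direct softmax sampling.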
