Breaking the Memory Barrier: Near Infinite Batch Size Scaling for Contrastive Loss

22 October 2024
Zesen Cheng, Hang Zhang, Kehan Li, Sicong Leng, Zhiqiang Hu, Fei Wu, Deli Zhao, Xin Li, Lidong Bing
arXiv (abs) · PDF · HTML

Papers citing "Breaking the Memory Barrier: Near Infinite Batch Size Scaling for Contrastive Loss"

2 of 2 papers shown
AmorLIP: Efficient Language-Image Pretraining via Amortization
Haotian Sun, Yitong Li, Yuchen Zhuang, Niao He, Hanjun Dai, Bo Dai
VLM · 25 May 2025
A Multi-Task Foundation Model for Wireless Channel Representation Using Contrastive and Masked Autoencoder Learning
Berkay Guler, Giovanni Geraci, Hamid Jafarkhani
14 May 2025