Semantically-Conditioned Negative Samples for Efficient Contrastive Learning

12 February 2021
J. O'Neill
Danushka Bollegala
arXiv: 2102.06603

Papers citing "Semantically-Conditioned Negative Samples for Efficient Contrastive Learning"

4 / 4 papers shown
  • Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives
    David T. Hoffmann, Nadine Behrmann, Juergen Gall, Thomas Brox, M. Noroozi
    27 Jan 2022

  • ConvFiT: Conversational Fine-Tuning of Pretrained Language Models
    Ivan Vulić, Pei-hao Su, Sam Coope, D. Gerz, Paweł Budzianowski, I. Casanueva, Nikola Mrkšić, Tsung-Hsien Wen
    21 Sep 2021

  • Improved Baselines with Momentum Contrastive Learning
    Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
    09 Mar 2020

  • Aggregated Residual Transformations for Deep Neural Networks
    Saining Xie, Ross B. Girshick, Piotr Dollár, Zhuowen Tu, Kaiming He
    16 Nov 2016