ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2102.12982 · Cited By
A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives

25 February 2021
Nils Rethmeier
Isabelle Augenstein
SSL, VLM
ArXiv · PDF · HTML

Papers citing "A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives"

18 / 18 papers shown
Low Fidelity Visuo-Tactile Pretraining Improves Vision-Only Manipulation Performance
Selam Gano, Abraham George, A. Farimani
OnRL · 21 Jun 2024 · 40 / 1 / 0
Enhancing Context Through Contrast
Kshitij Ambilduke, Aneesh Shetye, Diksha Bagade, Rishika Bhagwatkar, Khurshed Fitter, P. Vagdargi, Shital S. Chiddarwar
06 Jan 2024 · 26 / 0 / 0
Improving Self-supervised Molecular Representation Learning using Persistent Homology
Yuankai Luo, Lei Shi, Veronika Thost
SSL · 29 Nov 2023 · 32 / 8 / 0
CoSiNES: Contrastive Siamese Network for Entity Standardization
Jiaqing Yuan, Michele Merler, M. Choudhury, Raju Pavuluri, Munindar P. Singh, M. Vukovic
05 Jun 2023 · 18 / 0 / 0
PDSum: Prototype-driven Continuous Summarization of Evolving Multi-document Sets Stream
Susik Yoon, Hou Pong Chan, Jiawei Han
10 Feb 2023 · 29 / 7 / 0
Empirical Evaluation and Theoretical Analysis for Representation Learning: A Survey
Kento Nozawa, Issei Sato
AI4TS · 18 Apr 2022 · 19 / 4 / 0
Fact Checking with Insufficient Evidence
Pepa Atanasova, J. Simonsen, Christina Lioma, Isabelle Augenstein
05 Apr 2022 · 37 / 14 / 0
Neighborhood Contrastive Learning for Scientific Document Representations with Citation Embeddings
Malte Ostendorff, Nils Rethmeier, Isabelle Augenstein, Bela Gipp, Georg Rehm
14 Feb 2022 · 19 / 73 / 0
Contrastive Document Representation Learning with Graph Attention Networks
Peng-Tao Xu, Xinchi Chen, Xiaofei Ma, Zhiheng Huang, Bing Xiang
20 Oct 2021 · 14 / 9 / 0
Dense Contrastive Visual-Linguistic Pretraining
Lei Shi, Kai Shuang, Shijie Geng, Peng Gao, Zuohui Fu, Gerard de Melo, Yunpeng Chen, Sen Su
VLM, SSL · 24 Sep 2021 · 52 / 10 / 0
Prototypical Graph Contrastive Learning
Shuai Lin, Pan Zhou, Zi-Yuan Hu, Shuojia Wang, Ruihui Zhao, Yefeng Zheng, Liang Lin, Eric P. Xing, Xiaodan Liang
17 Jun 2021 · 18 / 86 / 0
Disentangled Contrastive Learning for Learning Robust Textual Representations
Xiang Chen, Xin Xie, Zhen Bi, Hongbin Ye, Shumin Deng, Ningyu Zhang, Huajun Chen
11 Apr 2021 · 33 / 5 / 0
Contrastive Learning Inverts the Data Generating Process
Roland S. Zimmermann, Yash Sharma, Steffen Schneider, Matthias Bethge, Wieland Brendel
SSL · 17 Feb 2021 · 238 / 207 / 0
COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Yu Meng, Chenyan Xiong, Payal Bajaj, Saurabh Tiwary, Paul N. Bennett, Jiawei Han, Xia Song
16 Feb 2021 · 125 / 202 / 0
Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision
Chao Jia, Yinfei Yang, Ye Xia, Yi-Ting Chen, Zarana Parekh, Hieu H. Pham, Quoc V. Le, Yun-hsuan Sung, Zhen Li, Tom Duerig
VLM, CLIP · 11 Feb 2021 · 298 / 3,700 / 0
CoDA: Contrast-enhanced and Diversity-promoting Data Augmentation for Natural Language Understanding
Yanru Qu, Dinghan Shen, Yelong Shen, Sandra Sajeev, Jiawei Han, Weizhu Chen
16 Oct 2020 · 134 / 66 / 0
A Mutual Information Maximization Perspective of Language Representation Learning
Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, Dani Yogatama
SSL · 18 Oct 2019 · 214 / 165 / 0
Efficient Estimation of Word Representations in Vector Space
Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean
3DV · 16 Jan 2013 · 239 / 31,257 / 0