CSS-LM: A Contrastive Framework for Semi-supervised Fine-tuning of Pre-trained Language Models
arXiv: 2102.03752 · 7 February 2021
Yusheng Su, Xu Han, Yankai Lin, Zhengyan Zhang, Zhiyuan Liu, Peng Li, Jie Zhou, Maosong Sun
Papers citing "CSS-LM: A Contrastive Framework for Semi-supervised Fine-tuning of Pre-trained Language Models" (5 of 5 papers shown)
Vesper: A Compact and Effective Pretrained Model for Speech Emotion Recognition
Weidong Chen, Xiaofen Xing, Peihao Chen, Xiangmin Xu
VLM · 20 Jul 2023
A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives
Nils Rethmeier, Isabelle Augenstein
SSL, VLM · 25 Feb 2021
A Mutual Information Maximization Perspective of Language Representation Learning
Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, Dani Yogatama
SSL · 18 Oct 2019
Knowledge Enhanced Contextual Word Representations
Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith
09 Sep 2019
Text Summarization with Pretrained Encoders
Yang Liu, Mirella Lapata
MILM · 22 Aug 2019