A Mutual Information Maximization Perspective of Language Representation Learning

18 October 2019
Lingpeng Kong
Cyprien de Masson d'Autume
Wang Ling
Lei Yu
Zihang Dai
Dani Yogatama
    SSL

Papers citing "A Mutual Information Maximization Perspective of Language Representation Learning"

50 / 105 papers shown
Analyzing and Improving the Optimization Landscape of Noise-Contrastive Estimation
Bingbin Liu
Elan Rosenfeld
Pradeep Ravikumar
Andrej Risteski
23
13
0
21 Oct 2021
Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing
Haoyu He
Xingjian Shi
Jonas W. Mueller
Sheng Zha
Mu Li
George Karypis
16
9
0
23 Sep 2021
Improving Multimodal fusion via Mutual Dependency Maximisation
Pierre Colombo
E. Chapuis
Matthieu Labeau
Chloé Clavel
13
30
0
31 Aug 2021
Deep Dive into Semi-Supervised ELBO for Improving Classification Performance
Fahim Faisal Niloy
M. A. Amin
Akm Mahbubur Rahman
A. Ali
DRL
25
0
0
29 Aug 2021
ProtoInfoMax: Prototypical Networks with Mutual Information Maximization for Out-of-Domain Detection
Iftitahu Ni'mah
Meng Fang
Vlado Menkovski
Mykola Pechenizkiy
35
5
0
27 Aug 2021
An Evaluation of Generative Pre-Training Model-based Therapy Chatbot for Caregivers
Lu Wang
Munif Ishad Mujib
Jake Williams
G. Demiris
Jina Huh-Yoo
AI4MH
27
32
0
28 Jul 2021
Align before Fuse: Vision and Language Representation Learning with Momentum Distillation
Junnan Li
Ramprasaath R. Selvaraju
Akhilesh Deepak Gotmare
Chenyu You
Caiming Xiong
Guosheng Lin
FaML
71
1,889
0
16 Jul 2021
Improving Sequential Recommendation Consistency with Self-Supervised Imitation
Xu Yuan
Hongshen Chen
Yonghao Song
Xiaofang Zhao
Zhuoye Ding
Zhen He
Bo Long
13
22
0
26 Jun 2021
Evaluating Modules in Graph Contrastive Learning
Ganqu Cui
Y. Du
Cheng Yang
Jie Zhou
Liang Xu
Xing Zhou
Lifeng Wang
Zhiyuan Liu
23
3
0
15 Jun 2021
Pre-Trained Models: Past, Present and Future
Xu Han
Zhengyan Zhang
Ning Ding
Yuxian Gu
Xiao Liu
...
Jie Tang
Ji-Rong Wen
Jinhui Yuan
Wayne Xin Zhao
Jun Zhu
AIFin
MQ
AI4MH
40
815
0
14 Jun 2021
Hybrid Generative-Contrastive Representation Learning
Saehoon Kim
Sungwoong Kim
Juho Lee
SSL
22
11
0
11 Jun 2021
Self-Supervised Graph Learning with Proximity-based Views and Channel Contrast
Wei Zhuo
Guang Tan
SSL
19
0
0
07 Jun 2021
Understand and Improve Contrastive Learning Methods for Visual Representation: A Review
Ran Liu
SSL
29
12
0
06 Jun 2021
CLEVE: Contrastive Pre-training for Event Extraction
Ziqi Wang
Xiaozhi Wang
Xu Han
Yankai Lin
Lei Hou
Zhiyuan Liu
Peng Li
Juan-Zi Li
Jie Zhou
37
116
0
30 May 2021
Early Exiting with Ensemble Internal Classifiers
Tianxiang Sun
Yunhua Zhou
Xiangyang Liu
Xinyu Zhang
Hao Jiang
Bo Zhao
Xuanjing Huang
Xipeng Qiu
32
30
0
28 May 2021
Rethinking InfoNCE: How Many Negative Samples Do You Need?
Chuhan Wu
Fangzhao Wu
Yongfeng Huang
27
42
0
27 May 2021
Divide and Contrast: Self-supervised Learning from Uncurated Data
Yonglong Tian
Olivier J. Hénaff
Aaron van den Oord
SSL
64
96
0
17 May 2021
On Sampling-Based Training Criteria for Neural Language Modeling
Yingbo Gao
David Thulke
Alexander Gerstenberger
Viet Anh Khoa Tran
Ralf Schlüter
Hermann Ney
17
2
0
21 Apr 2021
Composable Augmentation Encoding for Video Representation Learning
Chen Sun
Arsha Nagrani
Yonglong Tian
Cordelia Schmid
SSL
AI4TS
37
17
0
01 Apr 2021
Self-supervised Representation Learning with Relative Predictive Coding
Yao-Hung Hubert Tsai
Martin Q. Ma
Muqiao Yang
Han Zhao
Louis-Philippe Morency
Ruslan Salakhutdinov
SSL
AI4TS
30
36
0
21 Mar 2021
Structure Inducing Pre-Training
Matthew B. A. McDermott
Brendan Yap
Peter Szolovits
Marinka Zitnik
42
18
0
18 Mar 2021
Cross-modal Image Retrieval with Deep Mutual Information Maximization
Chunbin Gu
Jiajun Bu
Xixi Zhou
Chengwei Yao
Dongfang Ma
Zhi Yu
Xifeng Yan
15
16
0
10 Mar 2021
The Rediscovery Hypothesis: Language Models Need to Meet Linguistics
Vassilina Nikoulina
Maxat Tezekbayev
Nuradil Kozhakhmet
Madina Babazhanova
Matthias Gallé
Z. Assylbekov
34
8
0
02 Mar 2021
A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives
Nils Rethmeier
Isabelle Augenstein
SSL
VLM
90
90
0
25 Feb 2021
CSS-LM: A Contrastive Framework for Semi-supervised Fine-tuning of Pre-trained Language Models
Yusheng Su
Xu Han
Yankai Lin
Zhengyan Zhang
Zhiyuan Liu
Peng Li
Jie Zhou
Maosong Sun
11
10
0
07 Feb 2021
ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning
Yujia Qin
Yankai Lin
Ryuichi Takanobu
Zhiyuan Liu
Peng Li
Heng Ji
Minlie Huang
Maosong Sun
Jie Zhou
55
125
0
30 Dec 2020
Evolution Is All You Need: Phylogenetic Augmentation for Contrastive Learning
Amy X. Lu
Alex X. Lu
Alan M. Moses
SSL
25
13
0
25 Dec 2020
Discriminative, Generative and Self-Supervised Approaches for Target-Agnostic Learning
Yuan Jin
Wray L. Buntine
F. Petitjean
Geoffrey I. Webb
SSL
25
1
0
12 Nov 2020
Latte-Mix: Measuring Sentence Semantic Similarity with Latent Categorical Mixtures
Minghan Li
He Bai
Luchen Tan
Kun Xiong
Ming Li
Jimmy J. Lin
FedML
17
0
0
21 Oct 2020
Contrastive Representation Learning: A Framework and Review
Phúc H. Lê Khắc
Graham Healy
Alan F. Smeaton
SSL
AI4TS
178
686
0
10 Oct 2020
InfoBERT: Improving Robustness of Language Models from An Information Theoretic Perspective
Wei Ping
Shuohang Wang
Yu Cheng
Zhe Gan
R. Jia
Bo-wen Li
Jingjing Liu
AAML
46
113
0
05 Oct 2020
Which *BERT? A Survey Organizing Contextualized Encoders
Patrick Xia
Shijie Wu
Benjamin Van Durme
26
50
0
02 Oct 2020
An Unsupervised Sentence Embedding Method by Mutual Information Maximization
Yan Zhang
Ruidan He
Zuozhu Liu
Kwan Hui Lim
Lidong Bing
SSL
22
178
0
25 Sep 2020
Improving Robustness and Generality of NLP Models Using Disentangled Representations
Jiawei Wu
Xiaoya Li
Xiang Ao
Yuxian Meng
Fei Wu
Jiwei Li
OOD
DRL
14
11
0
21 Sep 2020
Self-Supervised Contrastive Learning for Code Retrieval and Summarization via Semantic-Preserving Transformations
Nghi D. Q. Bui
Yijun Yu
Lingxiao Jiang
SSL
32
119
0
06 Sep 2020
S^3-Rec: Self-Supervised Learning for Sequential Recommendation with Mutual Information Maximization
Kun Zhou
Haibo Wang
Wayne Xin Zhao
Yutao Zhu
Sirui Wang
Fuzheng Zhang
Zhongyuan Wang
Ji-Rong Wen
36
783
0
18 Aug 2020
On Learning Universal Representations Across Languages
Xiangpeng Wei
Rongxiang Weng
Yue Hu
Luxi Xing
Heng Yu
Weihua Luo
SSL
VLM
33
85
0
31 Jul 2020
InfoMax-GAN: Improved Adversarial Image Generation via Information Maximization and Contrastive Learning
Kwot Sin Lee
Ngoc-Trung Tran
Ngai-man Cheung
GAN
21
66
0
09 Jul 2020
A Survey on Self-supervised Pre-training for Sequential Transfer Learning in Neural Networks
H. H. Mao
BDL
SSL
6
50
0
01 Jul 2020
Telescoping Density-Ratio Estimation
Benjamin Rhodes
Kai Xu
Michael U. Gutmann
25
94
0
22 Jun 2020
Self-supervised Learning: Generative or Contrastive
Xiao Liu
Fanjin Zhang
Zhenyu Hou
Zhaoyu Wang
Li Mian
Jing Zhang
Jie Tang
SSL
50
1,586
0
15 Jun 2020
Self-supervised Learning from a Multi-view Perspective
Yao-Hung Hubert Tsai
Yue Wu
Ruslan Salakhutdinov
Louis-Philippe Morency
SSL
25
30
0
10 Jun 2020
Neural Methods for Point-wise Dependency Estimation
Yao-Hung Hubert Tsai
Han Zhao
M. Yamada
Louis-Philippe Morency
Ruslan Salakhutdinov
25
31
0
09 Jun 2020
DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations
John Giorgi
Osvald Nitski
Bo Wang
Gary D. Bader
SSL
39
489
0
05 Jun 2020
Funnel-Transformer: Filtering out Sequential Redundancy for Efficient Language Processing
Zihang Dai
Guokun Lai
Yiming Yang
Quoc V. Le
48
229
0
05 Jun 2020
An efficient manifold density estimator for all recommendation systems
Jacek Dąbrowski
Barbara Rychalska
Michal Daniluk
Dominika Basaj
Konrad Gołuchowski
Piotr Babel
Andrzej Michalowski
Adam Jakubowski
6
19
0
02 Jun 2020
On Mutual Information in Contrastive Learning for Visual Representations
Mike Wu
Chengxu Zhuang
Milan Mossé
Daniel L. K. Yamins
Noah D. Goodman
SSL
23
82
0
27 May 2020
What Makes for Good Views for Contrastive Learning?
Yonglong Tian
Chen Sun
Ben Poole
Dilip Krishnan
Cordelia Schmid
Phillip Isola
SSL
39
1,307
0
20 May 2020
What are the Goals of Distributional Semantics?
Guy Edward Toh Emerson
11
26
0
06 May 2020
The Effect of Natural Distribution Shift on Question Answering Models
John Miller
K. Krauth
Benjamin Recht
Ludwig Schmidt
OOD
15
143
0
29 Apr 2020