Wasserstein Contrastive Representation Distillation

15 December 2020
Liqun Chen, Dong Wang, Zhe Gan, Jingjing Liu, Ricardo Henao, Lawrence Carin
ArXiv · PDF · HTML

Papers citing "Wasserstein Contrastive Representation Distillation"

17 / 17 papers shown
  • Your contrastive learning problem is secretly a distribution alignment problem
    Zihao Chen, Chi-Heng Lin, Ran Liu, Jingyun Xiao, Eva L. Dyer · 27 Feb 2025
  • Quantifying Knowledge Distillation Using Partial Information Decomposition
    Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta · 12 Nov 2024
  • Data-free Knowledge Distillation for Fine-grained Visual Categorization
    Renrong Shao, Wei Zhang, Jianhua Yin, Jun Wang · 18 Apr 2024
  • Learning to Project for Cross-Task Knowledge Distillation
    Dylan Auty, Roy Miles, Benedikt Kolbeinsson, K. Mikolajczyk · 21 Mar 2024
  • VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning
    Yanan Wang, Donghuo Zeng, Shinya Wada, Satoshi Kurihara · 27 Sep 2023
  • Towards Personalized Review Summarization by Modeling Historical Reviews from Customer and Product Separately
    Xin Cheng, Shen Gao, Yuchi Zhang, Yongliang Wang, Xiuying Chen, Mingzhe Li, Dongyan Zhao, Rui Yan · 27 Jan 2023
  • Exploring Content Relationships for Distilling Efficient GANs
    Lizhou You, Mingbao Lin, Tie Hu, Rongrong Ji · 21 Dec 2022
  • Multimodal Transformer Distillation for Audio-Visual Synchronization
    Xuan-Bo Chen, Haibin Wu, Chung-Che Wang, Hung-yi Lee, J. Jang · 27 Oct 2022
  • Meta-Ensemble Parameter Learning [OOD]
    Zhengcong Fei, Shuman Tian, Junshi Huang, Xiaoming Wei, Xiaolin K. Wei · 05 Oct 2022
  • DeepWSD: Projecting Degradations in Perceptual Space to Wasserstein Distance in Deep Feature Space
    Xingran Liao, Baoliang Chen, Hanwei Zhu, Shiqi Wang, Mingliang Zhou, Sam Kwong · 05 Aug 2022
  • Modality-Aware Contrastive Instance Learning with Self-Distillation for Weakly-Supervised Audio-Visual Violence Detection
    Jiashuo Yu, Jin-Yuan Liu, Ying Cheng, Rui Feng, Yuejie Zhang · 12 Jul 2022
  • Generalized Knowledge Distillation via Relationship Matching [FedML]
    Han-Jia Ye, Su Lu, De-Chuan Zhan · 04 May 2022
  • Multimodal Adaptive Distillation for Leveraging Unimodal Encoders for Vision-Language Tasks [VLM, OffRL]
    Zhecan Wang, Noel Codella, Yen-Chun Chen, Luowei Zhou, Xiyang Dai, ..., Jianwei Yang, Haoxuan You, Kai-Wei Chang, Shih-Fu Chang, Lu Yuan · 22 Apr 2022
  • Fine-Tuning Graph Neural Networks via Graph Topology induced Optimal Transport [OT]
    Jiying Zhang, Xi Xiao, Long-Kai Huang, Yu Rong, Yatao Bian · 20 Mar 2022
  • Information Theoretic Representation Distillation [MQ]
    Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk · 01 Dec 2021
  • Compressing Visual-linguistic Model via Knowledge Distillation [VLM]
    Zhiyuan Fang, Jianfeng Wang, Xiaowei Hu, Lijuan Wang, Yezhou Yang, Zicheng Liu · 05 Apr 2021
  • Knowledge Distillation by On-the-Fly Native Ensemble
    Xu Lan, Xiatian Zhu, S. Gong · 12 Jun 2018