arXiv:2308.06453
Cited By
Multi-Label Knowledge Distillation
12 August 2023
Penghui Yang, Ming-Kun Xie, Chen-Chen Zong, Lei Feng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang
Papers citing "Multi-Label Knowledge Distillation" (9 papers shown):
- Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models (18 Apr 2025) [VLM]
  Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Tianyang Wang, ..., Y. Zhang, Qian Niu, Benji Peng, Keyu Chen, Ming Liu
- Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head (13 Nov 2024)
  Penghui Yang, Chen-Chen Zong, Sheng-Jun Huang, Lei Feng, Bo An
- SSPA: Split-and-Synthesize Prompting with Gated Alignments for Multi-Label Image Recognition (30 Jul 2024) [VLM]
  Hao Tan, Zichang Tan, Jun Li, Jun Wan, Zhen Lei, Stan Z. Li
- InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation (25 Jun 2024)
  Jinbin Huang, Wenbin He, Liang Gou, Liu Ren, Chris Bryan
- Robust Synthetic-to-Real Transfer for Stereo Matching (12 Mar 2024) [OOD]
  Jiawei Zhang, Jiahe Li, Lei Huang, Xiaohan Yu, Lin Gu, Jin Zheng, Xiao Bai
- Good Teachers Explain: Explanation-Enhanced Knowledge Distillation (05 Feb 2024) [FAtt]
  Amin Parchami-Araghi, Moritz Bohle, Sukrut Rao, Bernt Schiele
- PVLR: Prompt-driven Visual-Linguistic Representation Learning for Multi-Label Image Recognition (31 Jan 2024) [VLM]
  Hao Tan, Zichang Tan, Jun Li, Jun Wan, Zhen Lei
- Distilling Knowledge via Knowledge Review (19 Apr 2021)
  Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
- RepVGG: Making VGG-style ConvNets Great Again (11 Jan 2021)
  Xiaohan Ding, X. Zhang, Ningning Ma, Jungong Han, Guiguang Ding, Jian-jun Sun