CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence
arXiv: 2410.14741
17 October 2024
Authors: Zao Zhang, Huaming Chen, Pei Ning, Nan Yang, Dong Yuan
Links: ArXiv | PDF | HTML
Papers citing "CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence" (1 of 1 shown)
Title: Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Authors: Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Tianyang Wang, ..., Yujie Zhang, Qian Niu, Benji Peng, Keyu Chen, Ming Liu
Tags: VLM
Date: 18 Apr 2025