A Survey on Knowledge Distillation of Large Language Models
arXiv 2402.13116 · 20 February 2024
Xiaohan Xu, Ming Li, Chongyang Tao, Tao Shen, Reynold Cheng, Jinyang Li, Can Xu, Dacheng Tao, Dinesh Manocha
Tags: KELM · VLM
Papers citing "A Survey on Knowledge Distillation of Large Language Models" (4 of 104 papers shown)
Overcoming Catastrophic Forgetting by Incremental Moment Matching (24 Mar 2017)
Sang-Woo Lee, Jin-Hwa Kim, Jaehyun Jun, Jung-Woo Ha, Byoung-Tak Zhang
CLL · 61 · 674 · 0
Overcoming catastrophic forgetting in neural networks (02 Dec 2016)
J. Kirkpatrick, Razvan Pascanu, Neil C. Rabinowitz, J. Veness, Guillaume Desjardins, ..., A. Grabska-Barwinska, Demis Hassabis, Claudia Clopath, D. Kumaran, R. Hadsell
CLL · 288 · 7,410 · 0
Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding (01 Oct 2015)
Song Han, Huizi Mao, W. Dally
3DGS · 214 · 8,793 · 0
Flickr30k Entities: Collecting Region-to-Phrase Correspondences for Richer Image-to-Sentence Models (19 May 2015)
Bryan A. Plummer, Liwei Wang, Christopher M. Cervantes, Juan C. Caicedo, Julia Hockenmaier, Svetlana Lazebnik
182 · 2,033 · 0