A Survey on Knowledge Distillation of Large Language Models

20 February 2024
Xiaohan Xu, Ming Li, Chongyang Tao, Tao Shen, Reynold Cheng, Jinyang Li, Can Xu, Dacheng Tao, Dinesh Manocha
Topics: KELM, VLM
arXiv: 2402.13116

Papers citing "A Survey on Knowledge Distillation of Large Language Models"

4 / 104 papers shown

| Title | Authors | Topics | Citations | Date |
|---|---|---|---|---|
| Overcoming Catastrophic Forgetting by Incremental Moment Matching | Sang-Woo Lee, Jin-Hwa Kim, Jaehyun Jun, Jung-Woo Ha, Byoung-Tak Zhang | CLL | 674 | 24 Mar 2017 |
| Overcoming catastrophic forgetting in neural networks | J. Kirkpatrick, Razvan Pascanu, Neil C. Rabinowitz, J. Veness, Guillaume Desjardins, ..., A. Grabska-Barwinska, Demis Hassabis, Claudia Clopath, D. Kumaran, R. Hadsell | CLL | 7,410 | 02 Dec 2016 |
| Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding | Song Han, Huizi Mao, W. Dally | 3DGS | 8,793 | 01 Oct 2015 |
| Flickr30k Entities: Collecting Region-to-Phrase Correspondences for Richer Image-to-Sentence Models | Bryan A. Plummer, Liwei Wang, Christopher M. Cervantes, Juan C. Caicedo, Julia Hockenmaier, Svetlana Lazebnik | | 2,033 | 19 May 2015 |