ResearchTrend.AI
BiLD: Bi-directional Logits Difference Loss for Large Language Model Distillation

19 June 2024
Minchong Li, Feng Zhou, Xiaohui Song

Papers citing "BiLD: Bi-directional Logits Difference Loss for Large Language Model Distillation"

3 / 3 papers shown
  • DistiLLM-2: A Contrastive Approach Boosts the Distillation of LLMs
    Jongwoo Ko, Tianyi Chen, Sungnyun Kim, Tianyu Ding, Luming Liang, Ilya Zharkov, Se-Young Yun
    VLM · 168 · 0 · 0 · 10 Mar 2025
  • Extreme Compression of Large Language Models via Additive Quantization
    Vage Egiazarian, Andrei Panferov, Denis Kuznedelev, Elias Frantar, Artem Babenko, Dan Alistarh
    MQ · 100 · 90 · 0 · 11 Jan 2024
  • PromptMix: A Class Boundary Augmentation Method for Large Language Model Distillation
    Gaurav Sahu, Olga Vechtomova, Dzmitry Bahdanau, I. Laradji
    VLM · 55 · 24 · 0 · 22 Oct 2023