ResearchTrend.AI

Evolving Knowledge Distillation with Large Language Models and Active Learning

11 March 2024
Chengyuan Liu, Yangyang Kang, Fubang Zhao, Kun Kuang, Zhuoren Jiang, Changlong Sun, Fei Wu
arXiv: 2403.06414

Papers citing "Evolving Knowledge Distillation with Large Language Models and Active Learning"

5 / 5 papers shown
LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions
Minghao Wu, Abdul Waheed, Chiyu Zhang, Muhammad Abdul-Mageed, Alham Fikri Aji
27 Apr 2023
ZeroShotDataAug: Generating and Augmenting Training Data with ChatGPT
S. Ubani, Suleyman O. Polat, Rodney D. Nielsen
27 Apr 2023
Self-Guided Noise-Free Data Generation for Efficient Zero-Shot Learning
Jiahui Gao, Renjie Pi, Yong Lin, Hang Xu, Jiacheng Ye, Zhiyong Wu, Weizhong Zhang, Xiaodan Liang, Zhenguo Li, Lingpeng Kong
25 May 2022
Cold-start Active Learning through Self-supervised Language Modeling
Michelle Yuan, Hsuan-Tien Lin, Jordan L. Boyd-Graber
19 Oct 2020
Fine-Tuning Language Models from Human Preferences
Daniel M. Ziegler, Nisan Stiennon, Jeff Wu, Tom B. Brown, Alec Radford, Dario Amodei, Paul Christiano, G. Irving
18 Sep 2019