Evolving Knowledge Distillation with Large Language Models and Active Learning
arXiv 2403.06414, 11 March 2024
Chengyuan Liu, Yangyang Kang, Fubang Zhao, Kun Kuang, Zhuoren Jiang, Changlong Sun, Fei Wu
Papers citing "Evolving Knowledge Distillation with Large Language Models and Active Learning" (5 papers shown):
1. LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions
   Minghao Wu, Abdul Waheed, Chiyu Zhang, Muhammad Abdul-Mageed, Alham Fikri Aji (27 Apr 2023) [ALM]
2. ZeroShotDataAug: Generating and Augmenting Training Data with ChatGPT
   S. Ubani, Suleyman O. Polat, Rodney D. Nielsen (27 Apr 2023)
3. Self-Guided Noise-Free Data Generation for Efficient Zero-Shot Learning
   Jiahui Gao, Renjie Pi, Yong Lin, Hang Xu, Jiacheng Ye, Zhiyong Wu, Weizhong Zhang, Xiaodan Liang, Zhenguo Li, Lingpeng Kong (25 May 2022) [SyDa, VLM]
4. Cold-start Active Learning through Self-supervised Language Modeling
   Michelle Yuan, Hsuan-Tien Lin, Jordan L. Boyd-Graber (19 Oct 2020)
5. Fine-Tuning Language Models from Human Preferences
   Daniel M. Ziegler, Nisan Stiennon, Jeff Wu, Tom B. Brown, Alec Radford, Dario Amodei, Paul Christiano, G. Irving (18 Sep 2019) [ALM]