AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes
arXiv: 2506.14728 · 17 June 2025
Jiahao Qiu, Xinzhe Juan, Yimin Wang, L. Yang, Xuan Qi, Tongcheng Zhang, Jiacheng Guo, Yifu Lu, Zixin Yao, Hongru Wang, Shilong Liu, Xun Jiang, Liu Leqi, Mengdi Wang

Papers citing "AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes" (6 of 6 papers shown)

1. Distilling LLM Agent into Small Models with Retrieval and Code Tools
   Minki Kang, Jongwon Jeong, Seanie Lee, Jaewoong Cho, Sung Ju Hwang
   [LRM] · 23 May 2025

2. ProtAgents: Protein discovery via large language model multi-agent collaborations combining physics and machine learning
   Alireza Ghafarollahi, Markus J. Buehler
   [LLMAG, AI4CE] · 27 Jan 2024

3. Symbolic Chain-of-Thought Distillation: Small Models Can Also "Think" Step-by-Step
   Liunian Harold Li, Jack Hessel, Youngjae Yu, Xiang Ren, Kai-Wei Chang, Yejin Choi
   [LRM, AI4CE, ReLM] · 24 Jun 2023

4. SLAKE: A Semantically-Labeled Knowledge-Enhanced Dataset for Medical Visual Question Answering
   Bo Liu, Li-Ming Zhan, Li Xu, Lin Ma, Y. Yang, Xiao-Ming Wu
   18 Feb 2021

5. MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices
   Zhiqing Sun, Hongkun Yu, Xiaodan Song, Renjie Liu, Yiming Yang, Denny Zhou
   [MQ] · 06 Apr 2020

6. MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers
   Wenhui Wang, Furu Wei, Li Dong, Hangbo Bao, Nan Yang, Ming Zhou
   [VLM] · 25 Feb 2020