arXiv:2505.19504
DOGe: Defensive Output Generation for LLM Protection Against Knowledge Distillation
26 May 2025
Pingzhi Li
Zhen Tan
Huaizhi Qu
Huan Liu
Tianlong Chen
AAML
Papers citing "DOGe: Defensive Output Generation for LLM Protection Against Knowledge Distillation" (1 of 1 shown)
Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models
Xiao Cui
Mo Zhu
Yulei Qin
Liang Xie
Wengang Zhou
Haoyang Li
19 Dec 2024