arXiv:2601.16515
SALAD: Achieve High-Sparsity Attention via Efficient Linear Attention Tuning for Video Diffusion Transformer
23 January 2026
Tongcheng Fang, Hanling Zhang, Ruiqi Xie, Zhuo Han, Xin Tao, Tianchen Zhao, Pengfei Wan, Wenbo Ding, Wanli Ouyang, Xuefei Ning, Yu Wang
Links: arXiv (abs) · PDF · HTML · HuggingFace (15 upvotes)
Papers citing this work: none found.