Linear Complexity Randomized Self-attention Mechanism
10 April 2022
Lin Zheng, Chong Wang, Lingpeng Kong
arXiv: 2204.04667 (v2)

Papers citing "Linear Complexity Randomized Self-attention Mechanism" (7 of 7 shown)

SecFwT: Efficient Privacy-Preserving Fine-Tuning of Large Language Models Using Forward-Only Passes
Jinglong Luo, Zhuo Zhang, Yehong Zhang, Shiyu Liu, Ye Dong, Xun Zhou, Hui Wang, Yue Yu, Zenglin Xu
18 Jun 2025

Scaling Reasoning without Attention
Xueliang Zhao, Wei Wu, Lingpeng Kong
Tags: OffRL, ReLM, LRM, VLM
28 May 2025

Linear Attention Sequence Parallelism
Weigao Sun, Zhen Qin, Dong Li, Xuyang Shen, Yu Qiao, Yiran Zhong
03 Apr 2024

The Devil in Linear Transformer
Zhen Qin, Xiaodong Han, Weixuan Sun, Dongxu Li, Lingpeng Kong, Nick Barnes, Yiran Zhong
19 Oct 2022

Linear Video Transformer with Feature Fixation
Kaiyue Lu, Zexia Liu, Jianyuan Wang, Weixuan Sun, Zhen Qin, ..., Xuyang Shen, Huizhong Deng, Xiaodong Han, Yuchao Dai, Yiran Zhong
15 Oct 2022

CAB: Comprehensive Attention Benchmarking on Long Sequence Modeling
Jinchao Zhang, Shuyang Jiang, Jiangtao Feng, Lin Zheng, Dianbo Sui
Tags: 3DV
14 Oct 2022

Attention and Self-Attention in Random Forests
Lev V. Utkin, A. Konstantinov
09 Jul 2022