ResearchTrend.AI

Linear Complexity Randomized Self-attention Mechanism
arXiv:2204.04667, v2 (latest)

10 April 2022
Lin Zheng
Chong-Jun Wang
Lingpeng Kong

Papers citing "Linear Complexity Randomized Self-attention Mechanism"

7 papers shown
SecFwT: Efficient Privacy-Preserving Fine-Tuning of Large Language Models Using Forward-Only Passes
Jinglong Luo
Zhuo Zhang
Yehong Zhang
Shiyu Liu
Ye Dong
Xun Zhou
Hui Wang
Yue Yu
Zenglin Xu
18 Jun 2025
Scaling Reasoning without Attention
Xueliang Zhao
Wei Wu
Lingpeng Kong
Topics: OffRL, ReLM, LRM, VLM
28 May 2025
Linear Attention Sequence Parallelism
Weigao Sun
Zhen Qin
Dong Li
Xuyang Shen
Yu Qiao
Yiran Zhong
03 Apr 2024
The Devil in Linear Transformer
Zhen Qin
Xiaodong Han
Weixuan Sun
Dongxu Li
Lingpeng Kong
Nick Barnes
Yiran Zhong
19 Oct 2022
Linear Video Transformer with Feature Fixation
Kaiyue Lu
Zexia Liu
Jianyuan Wang
Weixuan Sun
Zhen Qin
...
Xuyang Shen
Huizhong Deng
Xiaodong Han
Yuchao Dai
Yiran Zhong
15 Oct 2022
CAB: Comprehensive Attention Benchmarking on Long Sequence Modeling
Jinchao Zhang
Shuyang Jiang
Jiangtao Feng
Lin Zheng
Dianbo Sui
Topics: 3DV
14 Oct 2022
Attention and Self-Attention in Random Forests
Lev V. Utkin
A. Konstantinov
09 Jul 2022