ResearchTrend.AI


Efficient Attention Network: Accelerate Attention by Searching Where to Plug

28 November 2020
Zhongzhan Huang, Senwei Liang, Mingfu Liang, Wei He, Haizhao Yang

Papers citing "Efficient Attention Network: Accelerate Attention by Searching Where to Plug"

3 papers shown
LSAS: Lightweight Sub-attention Strategy for Alleviating Attention Bias Problem
Shan Zhong, Wushao Wen, Jinghui Qin, Qiang Chen, Zhongzhan Huang
09 May 2023
Mix-Pooling Strategy for Attention Mechanism
Shan Zhong, Wushao Wen, Jinghui Qin
22 Aug 2022
Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le
05 Nov 2016