ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

arXiv: 2006.09471
Untangling tradeoffs between recurrence and self-attention in neural networks

16 June 2020
Giancarlo Kerg, Bhargav Kanuparthi, Anirudh Goyal, Kyle Goyette, Yoshua Bengio, Guillaume Lajoie

Papers citing "Untangling tradeoffs between recurrence and self-attention in neural networks"

5 / 5 papers shown

Attention Mechanism in Neural Networks: Where it Comes and Where it Goes
Derya Soydaner — 27 Apr 2022

Continuous-Time Audiovisual Fusion with Recurrence vs. Attention for In-The-Wild Affect Recognition
Vincent Karas, M. Tellamekala, Adria Mallol-Ragolta, M. Valstar, Björn W. Schuller — 24 Mar 2022

Attention is All You Need in Speech Separation
Cem Subakan, Mirco Ravanelli, Samuele Cornell, Mirko Bronzi, Jianyuan Zhong — 25 Oct 2020

Deriving Differential Target Propagation from Iterating Approximate Inverses
Yoshua Bengio — 29 Jul 2020

A Decomposable Attention Model for Natural Language Inference
Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit — 06 Jun 2016