Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies

22 January 2019
A. Chandar, Chinnadhurai Sankar, Eugene Vorontsov, Samira Ebrahimi Kahou, Yoshua Bengio
arXiv: 1902.06704 (PDF / HTML)

Papers citing "Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies"

4 / 4 papers shown

RWKV: Reinventing RNNs for the Transformer Era
Bo Peng, Eric Alcaide, Quentin G. Anthony, Alon Albalak, Samuel Arcadinho, ..., Qihang Zhao, P. Zhou, Qinghua Zhou, Jian Zhu, Rui-Jie Zhu
22 May 2023

Random orthogonal additive filters: a solution to the vanishing/exploding gradient of deep neural networks
Andrea Ceni
03 Oct 2022

Slower is Better: Revisiting the Forgetting Mechanism in LSTM for Slower Information Decay
H. Chien, Javier S. Turek, Nicole M. Beckage, Vy A. Vo, C. Honey, Ted Willke
12 May 2021

HiPPO: Recurrent Memory with Optimal Polynomial Projections
Albert Gu, Tri Dao, Stefano Ermon, Atri Rudra, Christopher Ré
17 Aug 2020