Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies
arXiv: 1902.06704
22 January 2019
A. Chandar, Chinnadhurai Sankar, Eugene Vorontsov, Samira Ebrahimi Kahou, Yoshua Bengio

Papers citing "Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies"

3 / 3 papers shown

RWKV: Reinventing RNNs for the Transformer Era
Bo Peng, Eric Alcaide, Quentin G. Anthony, Alon Albalak, Samuel Arcadinho, …, Qihang Zhao, P. Zhou, Qinghua Zhou, Jian Zhu, Rui-Jie Zhu
22 May 2023

Random orthogonal additive filters: a solution to the vanishing/exploding gradient of deep neural networks
Andrea Ceni
03 Oct 2022

HiPPO: Recurrent Memory with Optimal Polynomial Projections
Albert Gu, Tri Dao, Stefano Ermon, Atri Rudra, Christopher Ré
17 Aug 2020