
Encoding-based Memory Modules for Recurrent Neural Networks

31 January 2020
Antonio Carta, A. Sperduti, D. Bacciu
Topics: KELM

Papers citing "Encoding-based Memory Modules for Recurrent Neural Networks"

5 of 5 citing papers shown
Skip RNN: Learning to Skip State Updates in Recurrent Neural Networks
Victor Campos, Brendan Jou, Xavier Giró-i-Nieto, Jordi Torres, Shih-Fu Chang
22 Aug 2017
Recurrent Orthogonal Networks and Long-Memory Tasks
Mikael Henaff, Arthur Szlam, Yann LeCun
22 Feb 2016
Unitary Evolution Recurrent Neural Networks
Martín Arjovsky, Amar Shah, Yoshua Bengio
Topics: ODL
20 Nov 2015
Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling
Junyoung Chung, Çağlar Gülçehre, Kyunghyun Cho, Yoshua Bengio
11 Dec 2014
Modeling Temporal Dependencies in High-Dimensional Sequences: Application to Polyphonic Music Generation and Transcription
Nicolas Boulanger-Lewandowski, Yoshua Bengio, Pascal Vincent
27 Jun 2012