Parallelizing Legendre Memory Unit Training

22 February 2021
Narsimha Chilkuri, C. Eliasmith

Papers citing "Parallelizing Legendre Memory Unit Training"

5 / 5 papers shown
NeuroBench: A Framework for Benchmarking Neuromorphic Computing Algorithms and Systems
Jason Yik, Korneel Van den Berghe, Douwe den Blanken, Younes Bouhadjar, Maxime Fabre, ..., Fatima Tuz Zohora, Charlotte Frenkel, Vijay Janapa Reddi
10 Apr 2023

Sequence Learning Using Equilibrium Propagation
Malyaban Bal, Abhronil Sengupta
14 Sep 2022

Efficiently Modeling Long Sequences with Structured State Spaces
Albert Gu, Karan Goel, Christopher Ré
31 Oct 2021

FlexConv: Continuous Kernel Convolutions with Differentiable Kernel Sizes
David W. Romero, Robert-Jan Bruintjes, Jakub M. Tomczak, Erik J. Bekkers, Mark Hoogendoorn, Jan van Gemert
15 Oct 2021

Language Modeling using LMUs: 10x Better Data Efficiency or Improved Scaling Compared to Transformers
Narsimha Chilkuri, Eric Hunsberger, Aaron R. Voelker, G. Malik, C. Eliasmith
05 Oct 2021