
xLSTM 7B: A Recurrent LLM for Fast and Efficient Inference

17 March 2025
M. Beck, Korbinian Poppel, Phillip Lippe, Richard Kurle, P. Blies, Günter Klambauer, Sebastian Böck, Sepp Hochreiter
LRM
arXiv: 2503.13427

Papers citing "xLSTM 7B: A Recurrent LLM for Fast and Efficient Inference"

1 / 1 papers shown
Overflow Prevention Enhances Long-Context Recurrent LLMs
Assaf Ben-Kish, Itamar Zimerman, M. Jehanzeb Mirza, James R. Glass, Leonid Karlinsky, Raja Giryes
LRM
12 May 2025