Enhancing RWKV-based Language Models for Long-Sequence Text Generation

21 February 2025
Xinghan Pan
Abstract

This paper introduces an enhanced RWKV architecture with adaptive temporal gating mechanisms for improved long-context language modeling. We propose two principal innovations: (1) a position-aware convolutional shift operator that captures local syntactic patterns while preserving global coherence, and (2) a neurally-gated information routing mechanism that dynamically regulates inter-token information flow. Through comprehensive experiments on text generation tasks, our enhanced model demonstrates superior performance compared to the baseline RWKV, achieving a 96.5% relative improvement in ROUGE-L scores with only a 2.95% increase in inference latency. Ablation studies validate the individual contributions of each component, while linguistic analysis reveals the model's adaptive attention to syntactic boundaries and entity coherence. The proposed modifications maintain RWKV's linear computational complexity while significantly enhancing its contextual modeling capabilities, establishing new state-of-the-art performance for recurrent-style architectures in long-form text generation.
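
The abstract names two architectural components but gives no implementation details, so the following is a minimal PyTorch sketch of how a position-aware convolutional shift operator and a neurally-gated routing step could look. The module names (PositionAwareShift, GatedRouter), the kernel size, the positional-gain embedding, and all dimensions are illustrative assumptions, not the authors' code.

# Hedged sketch: plausible shapes of the two components described in the
# abstract. Names, kernel size, and the positional-gain embedding are
# assumptions of this example, not the paper's published implementation.
import torch
import torch.nn as nn


class PositionAwareShift(nn.Module):
    """Causal depthwise convolution standing in for the 'position-aware
    convolutional shift operator': each channel mixes a short window of
    past tokens, scaled by a learned per-position gain."""

    def __init__(self, d_model: int, kernel_size: int = 3, max_len: int = 4096):
        super().__init__()
        self.conv = nn.Conv1d(
            d_model, d_model, kernel_size,
            groups=d_model,            # depthwise: local mixing per channel
            padding=kernel_size - 1,   # extra padding + crop keeps it causal
        )
        self.pos_scale = nn.Embedding(max_len, d_model)  # position-aware gain

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b, t, d = x.shape
        y = self.conv(x.transpose(1, 2))[..., :t].transpose(1, 2)  # causal crop
        pos = torch.arange(t, device=x.device)
        return y * torch.sigmoid(self.pos_scale(pos))


class GatedRouter(nn.Module):
    """Sketch of the 'neurally-gated information routing mechanism': a
    sigmoid gate decides, per token and channel, how much shifted context
    versus the original token representation flows onward."""

    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor, shifted: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([x, shifted], dim=-1)))
        return g * shifted + (1.0 - g) * x  # convex routing between the two paths


if __name__ == "__main__":
    x = torch.randn(2, 16, 64)              # (batch, seq_len, d_model)
    shift = PositionAwareShift(d_model=64)
    router = GatedRouter(d_model=64)
    out = router(x, shift(x))
    print(out.shape)                        # torch.Size([2, 16, 64])

In this reading, the causal depthwise convolution generalizes RWKV's single-step token shift to a slightly wider local window, and the sigmoid gate interpolates per channel between the shifted context and the original token, so the per-token cost stays linear in sequence length, consistent with the complexity claim in the abstract.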

@article{pan2025_2502.15485,
  title={Enhancing RWKV-based Language Models for Long-Sequence Text Generation},
  author={Xinghan Pan},
  journal={arXiv preprint arXiv:2502.15485},
  year={2025}
}