SCOPE: Optimizing Key-Value Cache Compression in Long-context Generation

18 December 2024
Jialong Wu, Zhenglin Wang, Linhai Zhang, Yilong Lai, Yulan He, Deyu Zhou
arXiv · PDF · HTML

Papers citing "SCOPE: Optimizing Key-Value Cache Compression in Long-context Generation"

1 paper shown.

Can LLMs Maintain Fundamental Abilities under KV Cache Compression?
Xiang Liu, Zhenheng Tang, Hong Chen, Peijie Dong, Zeyu Li, Xiuze Zhou, Bo Li, Xuming Hu, Xiaowen Chu
04 Feb 2025