ResearchTrend.AI
Delta-CoMe: Training-Free Delta-Compression with Mixed-Precision for Large Language Models

13 June 2024
Bowen Ping, Shuo Wang, Hanqing Wang, Xu Han, Yuzhuang Xu, Yukun Yan, Yun Chen, Baobao Chang, Zhiyuan Liu, Maosong Sun

Papers citing "Delta-CoMe: Training-Free Delta-Compression with Mixed-Precision for Large Language Models"

Agile-Quant: Activation-Guided Quantization for Faster Inference of LLMs on the Edge
Xuan Shen, Peiyan Dong, Lei Lu, Zhenglun Kong, Zhengang Li, Ming Lin, Chao Wu, Yanzhi Wang
09 Dec 2023