ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Squeeze10-LLM: Squeezing LLMs' Weights by 10 Times via a Staged Mixed-Precision Quantization Method

24 July 2025
Qingcheng Zhu
Yangyang Ren
L. Yang
Mingbao Lin
Yanjing Li
Sheng Xu
Zichao Feng
Haodong Zhu
Yuguang Yang
Juan Zhang
Runqi Wang
Baochang Zhang

Papers citing "Squeeze10-LLM: Squeezing LLMs' Weights by 10 Times via a Staged Mixed-Precision Quantization Method"

No papers found