ResearchTrend.AI
Pruning Pre-trained Language Models Without Fine-Tuning
arXiv: 2210.06210 · versions: v1, v2 (latest)
12 October 2022
Authors: Ting Jiang, Deqing Wang, Fuzhen Zhuang, Ruobing Xie, Feng Xia

Papers citing "Pruning Pre-trained Language Models Without Fine-Tuning"

2 / 2 papers shown

Beware of Calibration Data for Pruning Large Language Models
Authors: Yixin Ji, Yang Xiang, Juntao Li, Qingrong Xia, Ping Li, Xinyu Duan, Zhefeng Wang, Min Zhang
23 Oct 2024

MUX-PLMs: Data Multiplexing for High-throughput Language Models
Authors: Vishvak Murahari, Ameet Deshpande, Carlos E. Jimenez, Izhak Shafran, Mingqiu Wang, Yuan Cao, Karthik Narasimhan
Topic: MoE
24 Feb 2023