ResearchTrend.AI

arXiv:2209.12127
SpeedLimit: Neural Architecture Search for Quantized Transformer Models

25 September 2022
Yuji Chai, Luke Bailey, Yunho Jin, Matthew Karle, Glenn G. Ko, David Brooks, Gu-Yeon Wei, H. T. Kung
MQ

Papers citing "SpeedLimit: Neural Architecture Search for Quantized Transformer Models"

2 / 2 papers shown

I-BERT: Integer-only BERT Quantization
Sehoon Kim, A. Gholami, Z. Yao, Michael W. Mahoney, Kurt Keutzer
MQ · 05 Jan 2021

Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020