Compressing Language Models using Doped Kronecker Products

24 January 2020 · arXiv:2001.08896
Urmish Thakker, Paul Whatmough, Zhi-Gang Liu, Matthew Mattina, Jesse G. Beu
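
The title names the compression scheme: a large weight matrix is approximated by the Kronecker product of two much smaller factor matrices, and (per the "doped" qualifier) a sparse additive correction recovers accuracy that the rigid Kronecker structure loses. Below is a minimal NumPy sketch of that structure only; the shapes, the SVD-based nearest-Kronecker-product fit, and the 1% sparsity budget are illustrative assumptions, not the paper's training procedure.

import numpy as np

# Minimal sketch (hypothetical shapes): approximate a weight matrix W as
# W (m*p x n*q) ~= kron(A, B) + S, with A (m x n), B (p x q), S sparse.
rng = np.random.default_rng(0)
m, n, p, q = 8, 8, 32, 32          # kron(A, B) is 256 x 256
W = rng.standard_normal((m * p, n * q))

# Nearest Kronecker product via the Van Loan rearrangement: reshape W so
# that a rank-1 approximation of the rearranged matrix yields A and B.
R = (W.reshape(m, p, n, q)         # split rows into (i, k), cols into (j, l)
      .transpose(0, 2, 1, 3)       # reorder to (i, j, k, l)
      .reshape(m * n, p * q))      # R[i*n + j, k*q + l] = W[i*p + k, j*q + l]
U, s, Vt = np.linalg.svd(R, full_matrices=False)
A = np.sqrt(s[0]) * U[:, 0].reshape(m, n)
B = np.sqrt(s[0]) * Vt[0].reshape(p, q)

# "Doping": keep only the largest residual entries as a sparse correction.
residual = W - np.kron(A, B)
k = int(0.01 * W.size)             # e.g. 1% nonzeros (illustrative budget)
idx = np.unravel_index(np.argsort(np.abs(residual), axis=None)[-k:], W.shape)
S = np.zeros_like(W)
S[idx] = residual[idx]

approx = np.kron(A, B) + S
print("relative error:", np.linalg.norm(W - approx) / np.linalg.norm(W))

With these toy shapes the factors store 64 + 1,024 parameters plus roughly 655 sparse corrections in place of 65,536 dense weights, which is the kind of storage trade-off the paper's title points at.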

Papers citing "Compressing Language Models using Doped Kronecker Products"

2 of 2 citing papers shown:

  • Kronecker Decomposition for GPT Compression
    Ali Edalati, Marzieh S. Tahaei, Ahmad Rashid, V. Nia, J. Clark, Mehdi Rezagholizadeh
    15 Oct 2021
  • KroneckerBERT: Learning Kronecker Decomposition for Pre-trained Language Models via Knowledge Distillation
    Marzieh S. Tahaei, Ella Charlaix, V. Nia, A. Ghodsi, Mehdi Rezagholizadeh
    13 Sep 2021