Compressing Language Models using Doped Kronecker Products
arXiv: 2001.08896
24 January 2020
Urmish Thakker, Paul Whatmough, Zhi-Gang Liu, Matthew Mattina, Jesse G. Beu
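As background for the listing above, a minimal sketch of the idea the title refers to: a Kronecker product lets a large weight matrix be stored as two much smaller factors, and "doping" (per the paper's framing) adds a sparse correction term on top. The shapes, sparsity level, and random values below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# A matrix W of shape (m*p, n*q) expressed as kron(A, B) needs only
# m*n + p*q parameters instead of (m*p)*(n*q).
m, n, p, q = 8, 8, 32, 32          # illustrative sizes, not from the paper
A = np.random.randn(m, n)
B = np.random.randn(p, q)

W = np.kron(A, B)                  # dense equivalent: 256 x 256
full_params = W.size               # 256 * 256 = 65536
factored_params = A.size + B.size  # 64 + 1024 = 1088

# "Doping": add a very sparse matrix S so W_doped = kron(A, B) + S,
# giving the factored model extra capacity at small parameter cost.
S = np.zeros_like(W)
nnz = 100                          # assumed sparsity budget
idx = np.random.choice(W.size, size=nnz, replace=False)
S.flat[idx] = np.random.randn(nnz)
W_doped = W + S

print(full_params, factored_params)  # 65536 1088
```

The compression ratio here is 65536 / (1088 + 100), roughly 55x, at the cost of approximating W rather than representing an arbitrary matrix exactly.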
Papers citing "Compressing Language Models using Doped Kronecker Products" (2 papers):

1. Kronecker Decomposition for GPT Compression
   Ali Edalati, Marzieh S. Tahaei, Ahmad Rashid, V. Nia, J. Clark, Mehdi Rezagholizadeh
   15 Oct 2021

2. KroneckerBERT: Learning Kronecker Decomposition for Pre-trained Language Models via Knowledge Distillation
   Marzieh S. Tahaei, Ella Charlaix, V. Nia, A. Ghodsi, Mehdi Rezagholizadeh
   13 Sep 2021