Low-Rank+Sparse Tensor Compression for Neural Networks (arXiv:2111.01697)

2 November 2021
Cole Hawkins, Haichuan Yang, Meng Li, Liangzhen Lai, Vikas Chandra

Papers citing "Low-Rank+Sparse Tensor Compression for Neural Networks"

LoSparse: Structured Compression of Large Language Models based on Low-Rank and Sparse Approximation
Yixiao Li, Yifan Yu, Qingru Zhang, Chen Liang, Pengcheng He, Weizhu Chen, Tuo Zhao
20 Jun 2023

How Informative is the Approximation Error from Tensor Decomposition for Neural Network Compression?
Jetze T. Schuurmans, Kim Batselier, Julian F. P. Kooij
09 May 2023

What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017