ResearchTrend.AI

Model compression as constrained optimization, with application to neural nets. Part I: general framework
Miguel Á. Carreira-Perpiñán
5 July 2017 · arXiv:1707.01209

Papers citing "Model compression as constrained optimization, with application to neural nets. Part I: general framework"

4 papers shown
On Model Compression for Neural Networks: Framework, Algorithm, and Convergence Guarantee
Chenyang Li, Jihoon Chung, Mengnan Du, Haimin Wang, Xianlian Zhou, Bohao Shen
13 Mar 2023
ProxQuant: Quantized Neural Networks via Proximal Operators
Yu Bai, Yu-Xiang Wang, Edo Liberty
01 Oct 2018
Blended Coarse Gradient Descent for Full Quantization of Deep Neural Networks
Penghang Yin, Shuai Zhang, J. Lyu, Stanley Osher, Y. Qi, Jack Xin
15 Aug 2018
Model compression as constrained optimization, with application to neural nets. Part II: quantization
M. A. Carreira-Perpiñán, Yerlan Idelbayev
13 Jul 2017