Weightless: Lossy Weight Encoding For Deep Neural Network Compression

13 November 2017
Brandon Reagen, Udit Gupta, Bob Adolf, Michael Mitzenmacher, Alexander M. Rush, Gu-Yeon Wei, David Brooks
Papers citing "Weightless: Lossy Weight Encoding For Deep Neural Network Compression"

4 papers shown
Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuanyuan Peng, Keyi Guo, Hongfang Yu
20 Mar 2021
T-Basis: a Compact Representation for Neural Networks
Anton Obukhov, M. Rakhuba, Stamatios Georgoulis, Menelaos Kanakis, Dengxin Dai, Luc Van Gool
13 Jul 2020
Bringing Giant Neural Networks Down to Earth with Unlabeled Data
Yehui Tang, Shan You, Chang Xu, Boxin Shi, Chao Xu
13 Jul 2019
DeepSZ: A Novel Framework to Compress Deep Neural Networks by Using Error-Bounded Lossy Compression
Sian Jin, Sheng Di, Xin Liang, Jiannan Tian, Dingwen Tao, Franck Cappello
26 Jan 2019