
Towards deep neural network compression via learnable wavelet transforms

20 April 2020
Moritz Wolter
Shaohui Lin
Angela Yao
arXiv: 2004.09569 (abs · PDF · HTML) · GitHub (25★)
Abstract

Wavelets are well known for data compression, yet have rarely been applied to the compression of neural networks. In this paper, we show how the fast wavelet transform can be applied to compress linear layers in neural networks. Linear layers still occupy a significant portion of the parameters in recurrent neural networks (RNNs). Through our method, we can learn both the wavelet bases and the corresponding coefficients to efficiently represent the linear layers of RNNs. Our wavelet-compressed RNNs have significantly fewer parameters yet still perform competitively with the state of the art on both synthetic and real-world RNN benchmarks. Wavelet optimization adds basis flexibility without a large number of extra weights.
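The mechanics of reconstructing a weight matrix from learnable wavelet coefficients can be illustrated with a deliberately simplified PyTorch sketch. The paper's actual construction is more elaborate (it learns filter coefficients subject to wavelet constraints), so treat the module below, including the hypothetical name `WaveletLinear`, as an assumption-laden toy: each output row is stored as half-length approximation coefficients, detail coefficients are dropped entirely (this is where the toy version saves parameters), and the full weight matrix is synthesized on the fly by a one-level inverse fast wavelet transform whose two-tap filter is Haar-initialized but learnable.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WaveletLinear(nn.Module):
    """Toy sketch (not the paper's exact parameterisation): a linear
    layer whose weight matrix is synthesised from learnable wavelet
    coefficients via a one-level inverse fast wavelet transform."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        assert in_features % 2 == 0, "sketch assumes an even input width"
        half = in_features // 2
        # Learnable approximation coefficients: one half-length vector
        # per output row. Detail coefficients are omitted, halving the
        # number of stored parameters (a simplifying assumption).
        self.approx = nn.Parameter(torch.randn(out_features, half) * 0.1)
        # Learnable lowpass synthesis filter, initialised to the Haar
        # reconstruction filter [1/sqrt(2), 1/sqrt(2)].
        s = 2.0 ** -0.5
        self.rec_lo = nn.Parameter(torch.tensor([s, s]))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def weight(self) -> torch.Tensor:
        # Inverse FWT, lowpass branch only: upsample by 2 and filter.
        # conv_transpose1d with stride=2 fuses both steps.
        a = self.approx.unsqueeze(1)                # (out, 1, half)
        lo = self.rec_lo.view(1, 1, -1)             # (1, 1, 2 taps)
        rows = F.conv_transpose1d(a, lo, stride=2)  # (out, 1, in)
        return rows.squeeze(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.linear(x, self.weight(), self.bias)


layer = WaveletLinear(in_features=8, out_features=4)
print(layer(torch.randn(2, 8)).shape)  # torch.Size([2, 4])
# Stored: 4 coefficients per row instead of 8 dense weights.
```

Because the filter taps are ordinary `nn.Parameter`s, gradient descent adapts the basis jointly with the coefficients, which is the "learnable wavelet" ingredient the abstract refers to; the paper additionally constrains such filters so they remain valid wavelets.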
