arXiv:2410.00557

STanH: Parametric Quantization for Variable Rate Learned Image Compression

1 October 2024
Alberto Presta
Enzo Tartaglione
Attilio Fiandrotti
Marco Grangetto
Abstract

In end-to-end learned image compression, the encoder and decoder are jointly trained to minimize a $R + \lambda D$ cost function, where $\lambda$ controls the trade-off between the rate of the quantized latent representation and image quality. Unfortunately, a distinct encoder-decoder pair with millions of parameters must be trained for each $\lambda$, hence the need to switch encoders and to store multiple encoders and decoders on the user device for every target rate. This paper proposes to exploit a differentiable quantizer designed around a parametric sum of hyperbolic tangents, called STanH, that relaxes the step-wise quantization function. STanH is implemented as a differentiable activation layer with learnable quantization parameters that can be plugged into a pre-trained fixed-rate model and refined to achieve different target bitrates. Experimental results show that our method enables variable rate coding with efficiency comparable to the state of the art, yet with significant savings in ease of deployment, training time, and storage costs.
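To make the idea concrete, below is a minimal sketch of a soft staircase quantizer built as a parametric sum of hyperbolic tangents, in the spirit of the abstract. The class name, the per-step parameterization (locations c, half-heights a, sharpness b), and all default values are illustrative assumptions, not the paper's exact STanH formulation.

```python
import torch
import torch.nn as nn

class SoftStaircase(nn.Module):
    """Differentiable soft quantizer: a learnable sum of tanh "steps" that
    relaxes hard rounding (illustrative sketch, not the official STanH code)."""

    def __init__(self, num_steps: int = 15):
        super().__init__()
        # Hypothetical parameterization: one tanh per quantization step,
        # with centers initially spaced one apart around zero.
        centers = torch.arange(num_steps, dtype=torch.float32) - num_steps / 2 + 0.5
        self.c = nn.Parameter(centers)                         # step locations
        self.a = nn.Parameter(torch.full((num_steps,), 0.5))   # step half-heights
        self.b = nn.Parameter(torch.full((num_steps,), 10.0))  # step sharpness

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each tanh contributes one smooth step; their sum approximates a
        # staircase while keeping nonzero gradients everywhere.
        steps = self.a * torch.tanh(self.b * (x.unsqueeze(-1) - self.c))
        return steps.sum(dim=-1)

# Usage sketch: apply the layer to the latent of a pre-trained model and
# fine-tune only its parameters to reach a new target bitrate.
quantizer = SoftStaircase(num_steps=15)
y = torch.randn(1, 192, 16, 16)   # latent tensor (shape is arbitrary here)
y_soft = quantizer(y)             # differentiable "quantized" latent
```

A common design choice with such soft quantizers is to sharpen or anneal the slopes toward a hard staircase for actual encoding; either way, the added parameters number in the dozens rather than millions, which is what makes per-rate refinement cheap compared to retraining an entire encoder-decoder pair.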
