
HadaNorm: Diffusion Transformer Quantization through Mean-Centered Transformations

11 June 2025
Marco Federici
Riccardo Del Chiaro
Boris van Breugel
Paul N. Whatmough
Markus Nagel
Main: 4 pages, 6 figures, 2 tables; Bibliography: 2 pages; Appendix: 2 pages
Abstract

Diffusion models represent the cutting edge in image generation, but their high memory and computational demands hinder deployment on resource-constrained devices. Post-Training Quantization (PTQ) offers a promising solution by reducing the bitwidth of matrix operations. However, standard PTQ methods struggle with outliers, and achieving higher compression often requires transforming model weights and activations before quantization. In this work, we propose HadaNorm, a novel linear transformation that extends existing approaches by both normalizing channel activations and applying Hadamard transforms to effectively mitigate outliers and enable aggressive activation quantization. We demonstrate that HadaNorm consistently reduces quantization error across the various components of transformer blocks, outperforming state-of-the-art methods.
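The abstract describes the core recipe: mean-center activations per channel, apply a Hadamard transform to spread outliers across channels, then quantize. Below is a minimal NumPy sketch of that idea, not the authors' implementation; the function names, the 4-bit setting, and the toy data with one outlier channel are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import hadamard

def quantize(x, n_bits=8):
    """Symmetric uniform quantization to n_bits (a simple stand-in quantizer)."""
    scale = np.abs(x).max() / (2 ** (n_bits - 1) - 1)
    return np.round(x / scale) * scale

def hadanorm_quantize(x, n_bits=8):
    """Sketch of the HadaNorm idea for x of shape (tokens, channels).

    Channel count must be a power of two for scipy's Hadamard construction.
    """
    mu = x.mean(axis=0, keepdims=True)               # per-channel means
    n = x.shape[1]
    h = hadamard(n) / np.sqrt(n)                     # orthonormal Hadamard matrix
    x_t = (x - mu) @ h                               # mean-center, then rotate
    x_q = quantize(x_t, n_bits)                      # aggressive activation quantization
    return x_q @ h.T + mu                            # undo rotation and re-add means

# Toy example: a heavy per-channel outlier inflates plain quantization error.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 128))
x[:, 3] += 20.0                                      # simulated outlier channel
err_plain = np.abs(quantize(x, 4) - x).mean()
err_hn = np.abs(hadanorm_quantize(x, 4) - x).mean()
print(f"plain: {err_plain:.4f}  hadanorm: {err_hn:.4f}")
```

The rotation flattens the outlier channel's energy across all channels, shrinking the dynamic range each quantizer sees; in a real deployment the transform and its inverse would be folded into adjacent weight matrices rather than applied at runtime.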

arXiv: https://arxiv.org/abs/2506.09932
@article{federici2025_2506.09932,
  title={HadaNorm: Diffusion Transformer Quantization through Mean-Centered Transformations},
  author={Marco Federici and Riccardo Del Chiaro and Boris van Breugel and Paul Whatmough and Markus Nagel},
  journal={arXiv preprint arXiv:2506.09932},
  year={2025}
}