Diffusion models represent the cutting edge in image generation, but their high memory and computational demands hinder deployment on resource-constrained devices. Post-Training Quantization (PTQ) offers a promising solution by reducing the bitwidth of matrix operations. However, standard PTQ methods struggle with outliers, and achieving higher compression often requires transforming model weights and activations before quantization. In this work, we propose HadaNorm, a novel linear transformation that extends existing approaches by both normalizing channel activations and applying Hadamard transforms, effectively mitigating outliers and enabling aggressive activation quantization. We demonstrate that HadaNorm consistently reduces quantization error across the various components of transformer blocks, outperforming state-of-the-art methods.
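The core idea of mean-centering plus a Hadamard rotation can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the quantizer, bitwidth, and the synthetic outlier channel are illustrative assumptions; it only shows why subtracting per-channel means and rotating with an orthonormal Hadamard matrix spreads outlier energy and shrinks the quantization error of a simple uniform quantizer.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction of an orthonormal Hadamard matrix; n must be a power of two
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H / np.sqrt(n)

def quantize(x, bits=4):
    # simple symmetric per-tensor uniform quantizer (illustrative choice)
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(x).max() / qmax
    return np.round(x / scale) * scale

rng = np.random.default_rng(0)
# synthetic activations: roughly Gaussian, with one strongly offset outlier channel
X = rng.normal(size=(128, 64))
X[:, 3] += 20.0

# mean-center per channel, then rotate with the Hadamard matrix
mu = X.mean(axis=0)
H = hadamard(64)
Xt = (X - mu) @ H  # outlier energy is now spread across all channels

# compare reconstruction error with and without the transform
err_plain = np.abs(quantize(X) - X).mean()
err_trans = np.abs(quantize(Xt) @ H.T + mu - X).mean()
print(f"plain: {err_plain:.3f}, transformed: {err_trans:.3f}")
```

Because the outlier channel dominates the quantization scale in the plain case, the transformed path yields a markedly lower mean error; the rotation and mean can be folded back exactly since the Hadamard matrix is orthonormal.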
@article{federici2025_2506.09932,
  title={HadaNorm: Diffusion Transformer Quantization through Mean-Centered Transformations},
  author={Marco Federici and Riccardo Del Chiaro and Boris van Breugel and Paul Whatmough and Markus Nagel},
  journal={arXiv preprint arXiv:2506.09932},
  year={2025}
}