We introduce Color Disentangled Style Transfer (CDST), a novel and efficient two-stream style transfer training paradigm that completely isolates color from style and forces the style stream to be color-blind. With a single model, CDST unlocks universal style transfer capabilities in a tuning-free manner at inference time. Notably, it solves characteristics-preserving style transfer with both style and content references in a tuning-free way for the first time. CDST significantly improves style similarity through multi-feature image embedding compression and preserves strong editing capability via our new CDST style definition, inspired by the Diffusion UNet disentanglement law. Through thorough qualitative and quantitative experiments and human evaluations, we demonstrate that CDST achieves state-of-the-art results on a variety of style transfer tasks.
@article{zhang2025_2506.13770,
  title={CDST: Color Disentangled Style Transfer for Universal Style Reference Customization},
  author={Shiwen Zhang and Zhuowei Chen and Lang Chen and Yanze Wu},
  journal={arXiv preprint arXiv:2506.13770},
  year={2025}
}