
arXiv:2105.14230
Transforming the Latent Space of StyleGAN for Real Face Editing

29 May 2021
Heyi Li
Jinlong Liu
Xinyu Zhang
Yunzhi Bai
Huayan Wang
Klaus Mueller
Abstract

Despite recent advances in semantic manipulation using StyleGAN, semantic editing of real faces remains challenging. The gap between the W space and the W+ space demands an undesirable trade-off between reconstruction quality and editing quality. To solve this problem, we propose to expand the latent space by replacing the fully-connected layers in StyleGAN's mapping network with attention-based transformers. This simple yet effective technique integrates the two aforementioned spaces and transforms them into one new latent space, called W++. Our modified StyleGAN maintains the state-of-the-art generation quality of the original StyleGAN with moderately better diversity. More importantly, the proposed W++ space achieves superior performance in both reconstruction quality and editing quality. Despite these significant advantages, the W++ space supports existing inversion algorithms and editing methods with only negligible modifications, thanks to its structural similarity with the W/W+ space. Extensive experiments on the FFHQ dataset show that the proposed W++ space is clearly preferable to the previous W/W+ space for real face editing. The code is publicly available for research purposes at https://github.com/AnonSubm2021/TransStyleGAN.
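The core architectural change the abstract describes — swapping the mapping network's fully-connected stack for attention-based transformer layers — can be sketched roughly as below. This is a hypothetical illustration, not the authors' code: the layer count, dimensions, token-per-style-layer layout, and class name `TransformerMappingNetwork` are all assumptions; see the linked repository for the actual implementation.

```python
import torch
import torch.nn as nn

class TransformerMappingNetwork(nn.Module):
    """Sketch of a StyleGAN mapping network where the usual 8-layer MLP is
    replaced by a transformer encoder. One input token per synthesis layer
    lets self-attention couple the per-layer styles, yielding a W+-like
    output from a single network (the abstract's W++ idea, approximated)."""

    def __init__(self, z_dim=512, w_dim=512, num_layers=8, num_heads=8):
        super().__init__()
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=z_dim, nhead=num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer,
                                                 num_layers=num_layers)
        self.out = nn.Linear(z_dim, w_dim)

    def forward(self, z):
        # z: (batch, num_styles, z_dim) -- normalize each token as in
        # StyleGAN's pixel norm, then mix tokens via self-attention.
        z = z / (z.square().mean(dim=-1, keepdim=True) + 1e-8).sqrt()
        return self.out(self.transformer(z))

# Usage: 18 style tokens, as in a 1024x1024 StyleGAN synthesis network.
net = TransformerMappingNetwork()
w_plus = net(torch.randn(2, 18, 512))
print(w_plus.shape)  # torch.Size([2, 18, 512])
```

Because the output keeps the same per-layer shape as the familiar W+ codes, existing inversion and editing pipelines would need only minor changes to consume it, which is the compatibility property the abstract emphasizes.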
