  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:15:15
08 May 2022

Facial attribute manipulation (FAM) aims to synthesize desired facial images by modifying specific attributes while keeping the others unchanged. Existing methods suffer from the entanglement of facial attributes, which leads to unexpected artifacts and the loss of facial identity information after editing. To alleviate these issues, we propose a novel StyleGAN-based FAM framework, termed VR-FAM, designed to meet the core requirements of FAM: editing ability, low distortion, and high fidelity. First, we propose a variance-reduced encoder that brings the encoded latent codes closer to StyleGAN's native latent space. Second, we present a nonlinear latent transformation network that converts the source latent code into the target latent code, in keeping with the nonlinear structure of StyleGAN's latent space. Experimentally, we evaluate the proposed framework on the benchmark FFHQ dataset and demonstrate gains over recently published models in terms of editing accuracy and fidelity.
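The pipeline described above (invert an image to a latent code, then apply a learned nonlinear transformation to that code before regenerating the image) can be sketched as follows. This is a minimal toy illustration: the latent dimensionality matches StyleGAN's W space, but the random MLP weights and the attribute direction are illustrative placeholders, not VR-FAM's trained networks.

```python
# Toy sketch of nonlinear latent editing: w_target = w_source + f(w_source),
# where f is a small nonlinear network plus an attribute-specific offset.
# All weights here are random placeholders, not VR-FAM's actual parameters.
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 512  # dimensionality of StyleGAN's W latent space

def nonlinear_transform(w_src, attr_direction, strength=1.0):
    """Map a source latent code to a target latent code.

    Unlike a fixed linear edit direction, the offset here depends
    nonlinearly on the source code via a one-hidden-layer MLP.
    """
    W1 = rng.standard_normal((LATENT_DIM, 128)) * 0.01
    W2 = rng.standard_normal((128, LATENT_DIM)) * 0.01
    hidden = np.tanh(w_src @ W1)           # nonlinearity in latent space
    delta = hidden @ W2 + attr_direction   # attribute-specific offset
    return w_src + strength * delta        # edited latent code

# Placeholder for an encoder output and a learned attribute direction.
w_source = rng.standard_normal(LATENT_DIM)
direction = rng.standard_normal(LATENT_DIM) * 0.1  # e.g. a "smile" direction
w_target = nonlinear_transform(w_source, direction, strength=0.8)
print(w_target.shape)  # same shape as the source code
```

In the real system, `w_target` would then be fed to the StyleGAN generator to produce the edited face; the variance-reduced encoder's role is to ensure `w_source` already lies close to the region of W space the generator was trained on.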
