Representation Decomposition For Image Manipulation And Beyond
Shang-Fu Chen, Jai-Wei Yan, Ya-Fan Su, Yu-Chiang Frank Wang
SPS
Representation disentanglement aims at learning interpretable features so that the output can be recovered or manipulated accordingly. While existing works such as InfoGAN and AC-GAN address this problem, they derive disjoint attribute codes for feature disentanglement, which cannot be applied to existing (i.e., pre-trained) generative models. In this paper, we propose a decomposition GAN (dec-GAN), which decomposes an existing latent representation into content and attribute features. Guided by a classifier pre-trained on the attributes of interest, our dec-GAN separates those attributes from the latent representation, while data-recovery and feature-consistency objectives enforce the learning of our proposed method. Our experiments on multiple image datasets confirm the effectiveness and robustness of our dec-GAN over recent representation disentanglement models.
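To make the two objectives named above concrete, the following is a minimal, hypothetical sketch of the dec-GAN idea: split an existing latent code into content and attribute parts, then score the split with a data-recovery term and a feature-consistency term. All names, the slicing rule, and the loss weights here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: decompose a latent code and score it with the two
# objectives from the abstract (data recovery and feature consistency).

def decompose(z, attr_dim):
    """Split a latent code into (content, attribute) parts.
    Here the split is a simple slice; in dec-GAN the mapping is learned."""
    return z[:-attr_dim], z[-attr_dim:]

def mse(a, b):
    """Mean squared error between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def dec_gan_objective(z, recon, attr_pred, attr_target,
                      lambda_rec=1.0, lambda_attr=0.5):
    """Weighted sum of the two objectives:
    - data recovery: the decomposed parts should reconstruct z;
    - feature consistency: the attribute part should agree with a
      classifier pre-trained on the attributes of interest."""
    return (lambda_rec * mse(recon, z)
            + lambda_attr * mse(attr_pred, attr_target))

# Usage: split a 4-d latent code with a 1-d attribute, then combine losses.
content, attribute = decompose([0.2, -1.3, 0.7, 1.0], attr_dim=1)
loss = dec_gan_objective([1.0, 2.0], [1.0, 2.0], [0.0], [1.0])
```

In this toy run the reconstruction matches exactly (recovery loss 0) while the attribute prediction misses its target, so only the weighted consistency term contributes to the total loss.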