GENERALIZED ZERO-SHOT LEARNING USING CONDITIONAL WASSERSTEIN AUTOENCODER
Junhan Kim, Byonghyo Shim
SPS
Generalized zero-shot learning (GZSL) is a technique for training a deep learning model to identify unseen classes. Conventionally, conditional generative models have been employed to synthesize training data for the unseen classes from their class attributes. In this paper, we propose a new conditional generative model that greatly improves GZSL performance. In a nutshell, the proposed model, called conditional Wasserstein autoencoder (CWAE), minimizes the Wasserstein distance between the real and generated image feature distributions by exploiting an encoder-decoder architecture. Through extensive experiments, we show that the proposed CWAE outperforms conventional generative models in terms of GZSL classification performance.
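To illustrate the kind of objective such an encoder-decoder model optimizes, the sketch below implements a WAE-MMD-style conditional autoencoder step in NumPy: image features and class attributes are jointly encoded, reconstructed, and the latent distribution is pushed toward a Gaussian prior via an MMD penalty (a common sample-based surrogate used in Wasserstein autoencoders). All dimensions, weights, and the linear encoder/decoder are hypothetical toy choices for illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def mmd_rbf(x, y, sigma=1.0):
    """Squared maximum mean discrepancy between two sample sets
    using an RBF kernel (a sample-based divergence surrogate)."""
    def k(a, b):
        d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

# Toy dimensions (hypothetical, for illustration only).
feat_dim, attr_dim, z_dim, n = 8, 4, 2, 64

# Linear "encoder" and "decoder" with random weights; a real CWAE
# would use trained neural networks here.
W_enc = rng.normal(size=(feat_dim + attr_dim, z_dim)) * 0.1
W_dec = rng.normal(size=(z_dim + attr_dim, feat_dim)) * 0.1

x = rng.normal(size=(n, feat_dim))   # stand-in image features
a = rng.normal(size=(n, attr_dim))   # stand-in class attributes

# Conditional encoding and decoding: the attribute vector is
# concatenated to both the input and the latent code.
z = np.concatenate([x, a], axis=1) @ W_enc
x_rec = np.concatenate([z, a], axis=1) @ W_dec

recon = ((x - x_rec) ** 2).mean()                    # reconstruction term
penalty = mmd_rbf(z, rng.normal(size=(n, z_dim)))    # latent-vs-prior MMD
loss = recon + 10.0 * penalty                        # WAE-style objective
print(float(loss) > 0 and np.isfinite(loss))
```

After training such a model, unseen-class features can be generated by decoding prior samples conditioned on the unseen classes' attributes, and a standard classifier is then trained on the synthesized features.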