Length: 00:17:33
09 May 2022

Generalized zero-shot learning (GZSL) is a technique for training a deep learning model to recognize unseen classes. Conventionally, conditional generative models have been employed to generate training data for the unseen classes from their class attributes. In this paper, we propose a new conditional generative model that substantially improves GZSL performance. In a nutshell, the proposed model, called conditional Wasserstein autoencoder (CWAE), minimizes the Wasserstein distance between the real and generated image feature distributions by exploiting an encoder-decoder architecture. Through extensive experiments, we show that the proposed CWAE outperforms conventional generative models in GZSL classification performance.
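To make the described setup concrete, below is a minimal sketch of a conditional Wasserstein autoencoder for attribute-conditioned feature generation. It is not the authors' exact architecture: the layer sizes, feature dimension (e.g., a 2048-dimensional CNN feature), attribute dimension, and the MMD-based latent matching term (one common way to realize a WAE objective) are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class ConditionalWAE(nn.Module):
    """Sketch of a conditional WAE for image-feature generation (illustrative).

    The encoder maps a real image feature plus its class attribute to a latent
    code; the decoder reconstructs or generates an image feature from a latent
    code concatenated with the attribute. All dimensions are assumptions.
    """
    def __init__(self, feat_dim=2048, attr_dim=85, latent_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(feat_dim + attr_dim, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + attr_dim, 512), nn.ReLU(),
            nn.Linear(512, feat_dim),
        )

    def forward(self, x, a):
        z = self.encoder(torch.cat([x, a], dim=1))
        x_hat = self.decoder(torch.cat([z, a], dim=1))
        return x_hat, z

def mmd_penalty(z, z_prior):
    """RBF-kernel MMD between encoded codes and prior samples.

    Stands in for the latent-distribution matching term of a WAE-style
    objective; the actual penalty used in the paper may differ.
    """
    def rbf(a, b, sigma=1.0):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))
    n = z.size(0)
    k_zz = rbf(z, z).sum() - rbf(z, z).diag().sum()
    k_pp = rbf(z_prior, z_prior).sum() - rbf(z_prior, z_prior).diag().sum()
    k_zp = rbf(z, z_prior).sum()
    return (k_zz + k_pp) / (n * (n - 1)) - 2 * k_zp / (n * n)

# Illustrative training step: feature reconstruction plus latent MMD penalty,
# both conditioned on class attributes.
model = ConditionalWAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
x = torch.randn(32, 2048)   # real image features (e.g., from a CNN backbone)
a = torch.rand(32, 85)      # per-class attribute vectors
x_hat, z = model(x, a)
loss = nn.functional.mse_loss(x_hat, x) + 10.0 * mmd_penalty(z, torch.randn_like(z))
opt.zero_grad(); loss.backward(); opt.step()
```

After training, the decoder alone can be fed prior samples concatenated with the attributes of unseen classes to synthesize training features for those classes, which is the role such a generator plays in the GZSL pipeline described above.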
