09 May 2022

Generalized zero-shot learning (GZSL) trains a deep learning model to recognize classes unseen during training. Conventionally, conditional generative models are employed to generate training data for the unseen classes from their class attributes. In this paper, we propose a new conditional generative model that substantially improves GZSL performance. In a nutshell, the proposed model, called the conditional Wasserstein autoencoder (CWAE), minimizes the Wasserstein distance between the real and generated image feature distributions by exploiting an encoder-decoder architecture. Extensive experiments show that the proposed CWAE outperforms conventional generative models in GZSL classification performance.
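To make the idea concrete, below is a minimal sketch of a conditional Wasserstein autoencoder for GZSL feature generation. It is not the paper's reported configuration: the layer sizes, attribute dimensionality, loss weights, and the use of an MMD-style penalty as the Wasserstein regularizer are all assumptions made for illustration.

```python
# Sketch of a conditional Wasserstein autoencoder (WAE-MMD variant) for
# generating image features conditioned on class attributes.
# All dimensions and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

FEAT_DIM, ATTR_DIM, LATENT_DIM = 2048, 85, 64  # assumed feature/attribute/latent sizes

class CondEncoder(nn.Module):
    """Maps an image feature and its class attribute to a latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEAT_DIM + ATTR_DIM, 1024), nn.ReLU(),
            nn.Linear(1024, LATENT_DIM),
        )

    def forward(self, x, a):
        return self.net(torch.cat([x, a], dim=1))

class CondDecoder(nn.Module):
    """Reconstructs an image feature from a latent code and a class attribute."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + ATTR_DIM, 1024), nn.ReLU(),
            nn.Linear(1024, FEAT_DIM),
        )

    def forward(self, z, a):
        return self.net(torch.cat([z, a], dim=1))

def rbf_mmd(z, z_prior, sigma=1.0):
    """MMD penalty pushing encoded codes toward the Gaussian prior (WAE-MMD style)."""
    def kernel(a, b):
        d = torch.cdist(a, b) ** 2
        return torch.exp(-d / (2 * sigma ** 2))
    return kernel(z, z).mean() + kernel(z_prior, z_prior).mean() - 2 * kernel(z, z_prior).mean()

# One training step on a batch of real image features x with class attributes a.
enc, dec = CondEncoder(), CondDecoder()
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-4)

x = torch.randn(32, FEAT_DIM)   # placeholder image features (e.g., from a CNN backbone)
a = torch.randn(32, ATTR_DIM)   # placeholder class attributes

z = enc(x, a)
x_rec = dec(z, a)
loss = nn.functional.mse_loss(x_rec, x) + 10.0 * rbf_mmd(z, torch.randn_like(z))
opt.zero_grad(); loss.backward(); opt.step()

# At test time, features for an unseen class can be synthesized by sampling
# z ~ N(0, I) and decoding it together with that class's attribute vector;
# the synthesized features then train an ordinary GZSL classifier.
```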
