28 Mar 2022

One-class classification (OCC) methods for abnormality detection learn either a generative model of the inlier class (e.g., using variants of kernel principal component analysis) or a decision boundary that encapsulates the inlier class (e.g., using one-class variants of the support vector machine). Recent methods use deep-neural-network models to learn, for the inlier class, either latent-space distributions or autoencoders, but not both. OCC learning typically relies solely on inlier-class data, although some recent semi-supervised versions also leverage some outlier-class training data. We propose a robust and uncertainty-aware variational framework for OCC that leverages data-adaptive generalized-Gaussian (GG) models, enabling distribution modeling in both the latent space and the image space. We propose a reparameterization for samples from the latent-space GG to enable backpropagation. Results on publicly available real-world datasets show the benefits of our method over existing methods.
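The abstract mentions a reparameterization of latent-space generalized-Gaussian samples so that gradients can flow through the sampling step. The paper's exact construction is not given here, but the sketch below illustrates one common Gamma-based way to draw reparameterized GG samples in PyTorch; the function name `gg_rsample` and the choice of relying on PyTorch's implicitly reparameterized Gamma sampler are assumptions for illustration, not the authors' method.

```python
import torch
from torch.distributions import Gamma

def gg_rsample(mu, alpha, beta):
    """Reparameterized sample from a generalized-Gaussian (GG) distribution
    with mean `mu`, scale `alpha`, and shape `beta`,
    p(x) proportional to exp(-(|x - mu| / alpha)**beta).

    Illustrative construction (not necessarily the paper's):
    X = mu + alpha * S * G**(1/beta), where G ~ Gamma(1/beta, 1) and S is a
    random sign. Gradients reach `mu` and `alpha` directly, and `beta` via
    PyTorch's implicitly reparameterized Gamma sampler (rsample).
    """
    gamma = Gamma(concentration=1.0 / beta, rate=torch.ones_like(beta))
    g = gamma.rsample()                            # positive magnitude term
    sign = torch.sign(torch.rand_like(mu) - 0.5)   # random +/-1, no gradient needed
    return mu + alpha * sign * g.pow(1.0 / beta)

# Usage: a batch of 8 latent vectors of dimension 16
mu = torch.zeros(8, 16, requires_grad=True)
log_alpha = torch.zeros(8, 16, requires_grad=True)
beta = torch.full((8, 16), 1.5)                    # beta = 2 recovers a Gaussian
z = gg_rsample(mu, log_alpha.exp(), beta)
z.sum().backward()                                 # gradients reach mu and log_alpha
```

Setting `beta = 2` reduces the GG to a Gaussian, so this sampler generalizes the standard VAE reparameterization trick while allowing heavier or lighter tails.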
