Overcoming Posterior Collapse in Variational Autoencoders via EM-type Training

Ying Li (The University of Hong Kong); Lei Cheng (Zhejiang University); Feng Yin (The Chinese University of Hong Kong, Shenzhen); Michael Zhang (University of Hong Kong); Sergios Theodoridis (National and Kapodistrian University of Athens)

06 Jun 2023

Variational autoencoders (VAEs) are among the most prominent deep generative models for learning the underlying statistical distribution of high-dimensional data. However, VAE training suffers from a severe issue known as posterior collapse: the learned posterior distribution collapses to the assumed/pre-selected prior distribution, which limits the capacity of the learned posterior to convey data information. Previous work proposed a heuristic training scheme to mitigate this issue, whose core idea is to train the encoder and the decoder in an alternating fashion. However, this scheme has so far lacked a theoretical interpretation. This paper fills that gap for the first time by inspecting the scheme through the lens of the expectation-maximization (EM) framework. Under this framework, we propose a novel EM-type training algorithm that yields a controllable optimization process and allows for further extensions, e.g., employing implicit distribution models. Experimental results corroborate the superior performance of the proposed EM-type VAE training algorithm across various metrics.
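The abstract describes the alternating scheme only at a high level, so the following is a minimal sketch of what EM-type alternation for a VAE can look like: an E-type step that updates only the encoder (the approximate posterior) with the decoder frozen, followed by an M-type step that updates only the decoder with the encoder frozen. Everything here is an assumption for illustration, not the authors' algorithm: the standard Gaussian VAE with reparameterization, the layer sizes, and the hypothetical `e_iters`/`m_iters` inner-loop counts. The paper's extension to implicit distribution models is not shown.

```python
import torch
import torch.nn as nn

# Hypothetical minimal Gaussian VAE on flat vectors; all sizes are
# illustrative and not taken from the paper.
class Encoder(nn.Module):
    def __init__(self, x_dim=784, z_dim=16, h_dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)

class Decoder(nn.Module):
    def __init__(self, x_dim=784, z_dim=16, h_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim))

    def forward(self, z):
        return self.net(z)

def neg_elbo(x, enc, dec):
    # Negative ELBO with a standard-normal prior and a diagonal Gaussian
    # posterior; squared error stands in for the Gaussian log-likelihood.
    mu, logvar = enc(x)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
    recon = nn.functional.mse_loss(dec(z), x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

def em_type_step(x, enc, dec, opt_enc, opt_dec, e_iters=5, m_iters=1):
    # E-type step: refine the posterior by stepping only the encoder's
    # optimizer; decoder parameters are left untouched.
    for _ in range(e_iters):
        opt_enc.zero_grad()
        loss = neg_elbo(x, enc, dec)
        loss.backward()
        opt_enc.step()
    # M-type step: improve the likelihood model by stepping only the
    # decoder's optimizer; encoder parameters are left untouched.
    for _ in range(m_iters):
        opt_dec.zero_grad()
        loss = neg_elbo(x, enc, dec)
        loss.backward()
        opt_dec.step()
    return loss.item()

# Example usage with a random stand-in batch.
enc, dec = Encoder(), Decoder()
opt_enc = torch.optim.Adam(enc.parameters(), lr=1e-3)
opt_dec = torch.optim.Adam(dec.parameters(), lr=1e-3)
x = torch.rand(32, 784)
print(em_type_step(x, enc, dec, opt_enc, opt_dec))
```

Separating the two optimizers is what makes the process controllable in the EM sense: the ratio of E-type to M-type updates can be tuned so the posterior keeps pace with the decoder, rather than collapsing to the prior while the decoder learns to ignore the latent code.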
