Incorporate Maximum Mean Discrepancy In Recurrent Latent Space For Sequential Generative Model
Yuchi Zhang, Yongliang Wang, Yang Dong
Stochastic recurrent neural networks have shown promising performance for modeling complex sequences. Nonetheless, existing methods adopt the KL divergence as the distribution regularization in their latent spaces, which limits the choice of models for constructing the latent distribution. In this paper, we incorporate maximum mean discrepancy into the recurrent structure for distribution regularization. Maximum mean discrepancy can measure the difference between two distributions using only samples drawn from them, which enables us to construct more complicated latent distributions with neural networks. As a result, our proposed algorithm can model more complex sequences. Experiments conducted on two different sequential modeling tasks show that our method outperforms state-of-the-art sequential modeling algorithms.
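To illustrate the sample-based property the abstract relies on, below is a minimal sketch of an empirical MMD estimate between latent samples and prior samples, using an RBF kernel. The kernel choice, bandwidth sigma, batch shapes, and all variable names are illustrative assumptions for exposition, not the paper's exact formulation.

```python
import torch

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel values between rows of x and rows of y.
    sq_dists = torch.cdist(x, y) ** 2
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def mmd(samples_p, samples_q, sigma=1.0):
    # Biased empirical estimate of squared MMD computed purely from samples,
    # so no closed-form density for either distribution is required.
    k_pp = rbf_kernel(samples_p, samples_p, sigma).mean()
    k_qq = rbf_kernel(samples_q, samples_q, sigma).mean()
    k_pq = rbf_kernel(samples_p, samples_q, sigma).mean()
    return k_pp + k_qq - 2 * k_pq

# Hypothetical usage: regularize encoder latent samples toward a Gaussian prior.
z_posterior = torch.randn(64, 16)   # stand-in for samples from the recurrent encoder
z_prior = torch.randn(64, 16)       # samples from the prior p(z)
penalty = mmd(z_posterior, z_prior)
```

Because the estimate only needs samples, the latent distribution itself can be produced by an arbitrary neural network rather than restricted to families with tractable KL divergence.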
Chairs: Jen-Tzung Chien