NCTUMLlab/Chun-Lin-Kuo-Mixtures-of-Gaussian

Variational Bayesian GAN for Mixtures-of-Gaussian

To test the model's ability to infer a multi-modal posterior, which helps avoid mode collapse (a form of over-fitting in regular GAN training), we generate 2-dimensional synthetic data from a mixture of 8 Gaussian distributions whose means are spaced equally on a circle. We train our proposed model, a vanilla GAN, and a Bayesian GAN on this synthetic data, plotting 100 samples from each model every 10,000 iterations. We find that our proposed model and the Bayesian GAN capture the modes of the Gaussian mixture more reliably and avoid mode collapse.
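The synthetic data described above can be generated with a few lines of NumPy. This is a minimal sketch, not the repository's actual data loader; the radius and per-mode standard deviation below are illustrative assumptions, since the text does not specify them.

```python
import numpy as np

def sample_ring_mixture(n_samples, n_modes=8, radius=2.0, std=0.02, seed=0):
    """Sample 2-D points from a mixture of `n_modes` Gaussians whose
    means are spaced equally on a circle of the given radius."""
    rng = np.random.default_rng(seed)
    angles = 2 * np.pi * np.arange(n_modes) / n_modes
    means = radius * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    # Pick a mode uniformly at random for each sample, then add noise.
    modes = rng.integers(0, n_modes, size=n_samples)
    return means[modes] + std * rng.normal(size=(n_samples, 2))

data = sample_ring_mixture(1000)
print(data.shape)  # (1000, 2)
```

With a small `std` relative to the radius, the 8 modes are well separated, which makes mode collapse easy to see in a scatter plot of generator samples.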

Bayesian GAN implementation credit: https://github.com/vasiloglou/mltrain-nips-2017/blob/master/ben_athiwaratkun/pytorch-bayesgan/Bayesian%20GAN%20in%20PyTorch.ipynb

  • Our Model Architecture

VBGAN

VBGAN with Wasserstein metric

Setting

  • Framework:
    • PyTorch 0.4.0
  • Hardware:
    • CPU: Intel Core i7-2600 @3.40 GHz
    • RAM: 20 GB DDR4-2400
    • GPU: GeForce GTX 980

Result of Vanilla GAN and VBGAN

Vanilla GAN (left) and our proposed model (right)
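Beyond visual inspection of the sample plots, mode coverage can be quantified by counting how many mixture modes have at least one generated sample nearby. This is a hypothetical helper, not part of the repository; the distance threshold is an illustrative assumption.

```python
import numpy as np

def count_covered_modes(samples, means, threshold=0.5):
    """Count how many mixture modes have at least one sample within
    `threshold` Euclidean distance of the mode mean."""
    # Pairwise distances, shape (n_samples, n_modes).
    d = np.linalg.norm(samples[:, None, :] - means[None, :, :], axis=-1)
    return int(np.sum(d.min(axis=0) < threshold))

# Mode means on a radius-2 circle, matching the synthetic setup above.
angles = 2 * np.pi * np.arange(8) / 8
means = 2.0 * np.stack([np.cos(angles), np.sin(angles)], axis=1)

# A collapsed generator that emits only one mode covers a single mode.
collapsed = np.tile(means[0], (100, 1))
print(count_covered_modes(collapsed, means))  # 1
```

A vanilla GAN suffering from mode collapse would score well below 8 on this metric, while a model whose samples cover the full ring would score 8.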
