Wasserstein GAN

6 Dec 2017 | Martin Arjovsky, Soumith Chintala, and Léon Bottou
The paper introduces the Wasserstein GAN (WGAN) as an alternative to traditional Generative Adversarial Networks (GANs). The authors address the limitations of GANs, particularly training instability and mode collapse, by proposing a new objective that minimizes the Earth Mover (EM) distance, a measure of the cost of transporting one distribution onto another. The EM distance is shown to be better behaved than other divergences for distributions supported on low-dimensional manifolds, making it easier to optimize and to obtain convergence.

Key contributions of the paper include:

1. **Theoretical Analysis**: A comprehensive analysis of how the EM distance behaves compared to other popular probability distances and divergences.
2. **Wasserstein GAN (WGAN)**: Definition and theoretical justification of WGAN, which minimizes an efficient approximation of the EM distance.
3. **Empirical Results**: Evidence that WGANs improve upon GANs in stability and sample quality, and can train a wider range of architectures without mode collapse.

The paper also discusses related work, including Integral Probability Metrics (IPMs), Maximum Mean Discrepancy (MMD), and Generative Moment Matching Networks (GMMNs), highlighting the advantages and limitations of each approach. The authors conclude that WGANs offer a more robust and stable alternative to traditional GANs, with practical benefits for image generation tasks.
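To make the EM distance concrete, the following sketch (not from the paper, which estimates the distance with a trained neural critic) computes the Wasserstein-1 distance between two 1-D empirical distributions with equal sample counts. In one dimension with equal weights, the optimal transport plan simply matches sorted samples pairwise, so the distance is the mean absolute difference of the sorted values:

```python
def em_distance_1d(xs, ys):
    """Wasserstein-1 (Earth Mover) distance between two equal-size
    1-D samples: sort both and average the pointwise transport costs."""
    assert len(xs) == len(ys), "sketch assumes equal sample counts"
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)

# Shifting a distribution by a constant c yields an EM distance of
# exactly c, whereas the JS divergence saturates once the supports
# are disjoint -- the key property motivating WGAN training.
print(em_distance_1d([0.0, 1.0, 2.0], [3.0, 4.0, 5.0]))  # 3.0
```

This smooth dependence on the shift is what gives the generator usable gradients even when the model and data distributions do not overlap.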