Energy-Based Generative Adversarial Networks

6 Mar 2017 | Junbo Zhao, Michael Mathieu and Yann LeCun
The paper introduces the Energy-Based Generative Adversarial Network (EBGAN) model, which views the discriminator as an energy function that assigns low energies to regions near the data manifold and higher energies to other regions. The generator is trained to produce contrastive samples with minimal energies, while the discriminator is trained to assign high energies to these generated samples. This approach allows for a wider range of architectures and loss functions, including an auto-encoder architecture where the energy is the reconstruction error. The authors demonstrate that this form of EBGAN exhibits more stable behavior during training compared to regular GANs and can generate high-resolution images from the ImageNet dataset.
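The auto-encoder variant mentioned above defines the discriminator's energy as the reconstruction error of a sample. A minimal sketch of that idea, assuming hypothetical `encode`/`decode` functions standing in for the trained auto-encoder:

```python
import numpy as np

def autoencoder_energy(x, encode, decode):
    """Energy of a sample under an auto-encoder discriminator:
    the mean squared reconstruction error between x and Dec(Enc(x)).
    Samples near the data manifold reconstruct well (low energy);
    samples far from it reconstruct poorly (high energy)."""
    recon = decode(encode(x))
    return float(np.mean((x - recon) ** 2))

# Toy illustration with a hypothetical identity auto-encoder:
# a perfect reconstruction yields zero energy.
x = np.array([0.5, -1.0, 2.0])
zero_energy = autoencoder_energy(x, lambda h: h, lambda h: h)

# A decoder that systematically misses by 1 yields nonzero energy.
bad_energy = autoencoder_energy(x, lambda h: h, lambda h: h + 1.0)
```

The `encode`/`decode` callables here are placeholders; in the paper the auto-encoder is a learned convolutional network, and the reconstruction error plays the role of D(x).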
The main contributions include an energy-based formulation for GAN training, a proof that under a simple hinge loss, the generator of EBGAN produces points that follow the underlying data distribution, and a set of experiments exploring hyper-parameters and architectural choices. The paper also discusses the theoretical analysis of the system, the use of auto-encoders, and related work, providing a comprehensive overview of the EBGAN framework and its applications.
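The hinge loss referenced in the contributions can be written down compactly. A hedged per-sample sketch, using the paper's objectives L_D = D(x) + max(0, m − D(G(z))) and L_G = D(G(z)), where the margin m is a hyper-parameter:

```python
def ebgan_losses(energy_real, energy_fake, margin=1.0):
    """Per-sample EBGAN hinge-loss objectives.

    energy_real: D(x), discriminator energy on a real sample
    energy_fake: D(G(z)), energy on a generated sample
    margin:      m, the hinge margin hyper-parameter

    The discriminator pushes real energies down and fake energies
    up to the margin; the generator minimizes its samples' energy.
    """
    d_loss = energy_real + max(0.0, margin - energy_fake)
    g_loss = energy_fake
    return d_loss, g_loss

# Fake energy below the margin: the hinge term is active,
# so the discriminator is penalized for the low fake energy.
d_active, g_active = ebgan_losses(0.2, 0.5, margin=1.0)

# Fake energy above the margin: the hinge term vanishes and
# only the real-sample energy contributes to the discriminator loss.
d_saturated, _ = ebgan_losses(0.2, 1.5, margin=1.0)
```

Once the fake energy exceeds the margin, the discriminator gradient with respect to that sample vanishes, which is the mechanism behind the paper's stability argument.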