Improved Techniques for Training Score-Based Generative Models

23 Oct 2020 | Yang Song, Stefano Ermon
Score-based generative models can produce high-quality image samples comparable to GANs without adversarial optimization, but have been limited to low-resolution images (typically below 32x32). This paper introduces new techniques to scale these models to higher resolutions (64x64 to 256x256) and improve training stability. The authors propose a method to analytically compute effective Gaussian noise scales from training data and an efficient architecture to amortize the score estimation task across multiple noise scales. They also derive techniques to optimize the performance of Langevin dynamics sampling and apply exponential moving averages to model parameters. These improvements enable score-based models to generate high-fidelity samples that rival best-in-class GANs on various datasets, including CelebA, FFHQ, and several LSUN categories. The paper includes theoretical analyses, experimental results, and ablation studies to support these findings.
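The annealed Langevin dynamics sampler the abstract refers to can be sketched as follows. This is a minimal NumPy sketch, not the paper's implementation: `score_fn`, the geometric noise schedule, the step-size rule, and the toy Gaussian score used in the example are all illustrative assumptions.

```python
import numpy as np

def annealed_langevin_sample(score_fn, sigmas, x0, eps=2e-5,
                             steps_per_sigma=200, rng=None):
    """Run Langevin dynamics at each noise scale, from largest to smallest.

    score_fn(x, sigma) should approximate the score (gradient of the
    log-density) of the data perturbed with Gaussian noise of scale sigma.
    """
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    for sigma in sigmas:  # sigmas sorted in decreasing order
        # Step size shrinks with the noise scale (one common heuristic).
        alpha = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(steps_per_sigma):
            z = rng.standard_normal(x.shape)
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

# Toy check: for a standard Gaussian target, the score is simply -x,
# so the sampler should produce approximately N(0, 1) samples.
toy_score = lambda x, sigma: -x
sigmas = np.geomspace(10.0, 0.1, 10)
samples = annealed_langevin_sample(toy_score, sigmas, x0=np.full(1000, 5.0))
```

In the actual models, `score_fn` is a single neural network conditioned on the noise scale, and the paper's contribution includes principled choices for `sigmas`, `eps`, and the number of steps rather than the ad hoc values above.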