18 Feb 2021 | Alex Nichol, Prafulla Dhariwal
This paper presents improvements to Denoising Diffusion Probabilistic Models (DDPMs), a class of generative models that achieve excellent sample quality but have not yet reached competitive log-likelihoods. The authors introduce several modifications that improve log-likelihood while maintaining high sample quality. Chief among these is learning the variances of the reverse diffusion process, which also allows sampling with far fewer steps at a negligible loss in quality. The authors further compare the distribution coverage of DDPMs and GANs using precision and recall metrics, finding that DDPMs achieve higher recall at similar FID scores, indicating better mode coverage. Finally, the paper examines how DDPMs scale with model capacity and training compute, showing that performance improves smoothly. The code is released at <https://github.com/openai/improved-diffusion>.
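To make the learned-variance idea concrete: rather than fixing the reverse-process variance to β_t or to the posterior variance β̃_t, the paper has the network output an interpolation coefficient and sets Σ_θ(x_t, t) = exp(v·log β_t + (1 − v)·log β̃_t). The snippet below is a minimal NumPy sketch of that parameterization, not the authors' released code; the function and variable names (`learned_variance`, `betas`, `alphas_cumprod`) are illustrative.

```python
import numpy as np

def learned_variance(v, betas, alphas_cumprod, t):
    """Sketch of the interpolated reverse-process variance from the paper.

    v               -- interpolation coefficient in [0, 1], predicted by the model
    betas           -- forward-process noise schedule (beta_1, ..., beta_T)
    alphas_cumprod  -- cumulative products of (1 - beta_t)
    t               -- timestep index, assumed >= 1 here (beta_tilde is 0 at t = 0)
    """
    beta_t = betas[t]
    alpha_bar_t = alphas_cumprod[t]
    alpha_bar_prev = alphas_cumprod[t - 1]
    # Posterior variance: beta_tilde_t = (1 - alpha_bar_{t-1}) / (1 - alpha_bar_t) * beta_t
    beta_tilde_t = (1.0 - alpha_bar_prev) / (1.0 - alpha_bar_t) * beta_t
    # Interpolate between beta_t and beta_tilde_t in log space
    return np.exp(v * np.log(beta_t) + (1.0 - v) * np.log(beta_tilde_t))
```

In the paper this variance head is trained with a hybrid objective, L_simple + λ·L_vlb with a small λ, so that the variance term does not overwhelm the mean-prediction loss.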