SCORE-BASED GENERATIVE MODELING THROUGH STOCHASTIC DIFFERENTIAL EQUATIONS

10 Feb 2021 | Yang Song*, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
This paper introduces a framework for score-based generative modeling using stochastic differential equations (SDEs). A forward SDE gradually transforms a complex data distribution into a known prior distribution by adding noise; a corresponding reverse-time SDE transforms the prior back into the data distribution by removing noise. The reverse-time SDE depends only on the score of the perturbed data distribution, i.e., the gradient of the log probability density. By leveraging advances in score-based generative modeling, scores can be estimated accurately with neural networks, and samples can be generated with numerical SDE solvers.

The framework encapsulates previous approaches to score-based generative modeling and diffusion probabilistic modeling, enabling new sampling procedures and modeling capabilities. The paper introduces a predictor-corrector framework, which pairs a numerical SDE solver (the predictor) with score-based MCMC steps such as Langevin dynamics (the corrector) to reduce discretization error when simulating the reverse-time SDE. It also derives an equivalent neural ODE (the probability flow ODE) that samples from the same distribution as the SDE while enabling exact likelihood computation and improved sampling efficiency.

The framework is applied to tasks including class-conditional generation, image inpainting, and colorization, achieving record-breaking performance on CIFAR-10 with an Inception score of 9.89 and an FID of 2.20. The paper also discusses the framework's theoretical and practical contributions: flexible sampling and likelihood computation, controllable generation, and a unified setting for exploring and tuning various SDEs, with new state-of-the-art results in likelihood computation and sampling efficiency.
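The predictor-corrector idea can be illustrated on a 1D toy problem where the true score is known in closed form. The sketch below (an illustrative assumption, not the paper's code) takes Gaussian data, a variance-exploding (VE) noise schedule sigma(t) = smin * (smax/smin)**t, an Euler-Maruyama predictor for the reverse-time SDE, and one Langevin corrector step per iteration; all names and step-size choices are hypothetical.

```python
import numpy as np

# Toy setup: data ~ N(0, sigma_data^2), so the perturbed marginal is
# p_t(x) = N(0, sigma_data^2 + sigma(t)^2) and its score is analytic.
rng = np.random.default_rng(0)
sigma_data = 1.0
smin, smax = 0.01, 10.0
log_ratio = np.log(smax / smin)

def sigma(t):
    # Geometric VE noise schedule sigma(t) = smin * (smax/smin)^t
    return smin * (smax / smin) ** t

def score(x, t):
    # grad_x log p_t(x) for Gaussian data (stands in for a learned network)
    return -x / (sigma_data**2 + sigma(t)**2)

def pc_sample(n=20000, steps=500, corrector_scale=0.1):
    # Start from the prior N(0, sigma(1)^2) and integrate t: 1 -> 0.
    x = rng.normal(0.0, sigma(1.0), size=n)
    ts = np.linspace(1.0, 1e-3, steps)
    dt = ts[0] - ts[1]  # positive step size in reverse time
    for t in ts:
        g2 = 2.0 * sigma(t)**2 * log_ratio  # g(t)^2 = d[sigma^2]/dt
        # Predictor: Euler-Maruyama step of the reverse-time SDE
        x = x + g2 * score(x, t) * dt + np.sqrt(g2 * dt) * rng.normal(size=n)
        # Corrector: one Langevin MCMC step targeting p_t
        eps = corrector_scale * sigma(t)**2  # illustrative step-size rule
        x = x + eps * score(x, t) + np.sqrt(2 * eps) * rng.normal(size=n)
    return x

samples = pc_sample()
print(round(float(samples.std()), 2))  # empirical std should be close to sigma_data
```

In a real model the analytic `score` would be replaced by a score network trained with denoising score matching; the predictor and corrector structure stays the same.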
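The probability flow ODE mentioned above can be sketched in the same toy Gaussian setting: replacing the reverse-time SDE by the deterministic ODE dx/dt = -(1/2) g(t)^2 * score(x, t) yields samples with the same marginals. The fixed-step Euler integrator and all parameter names below are illustrative assumptions.

```python
import numpy as np

# Same toy VE setup as before: Gaussian data with an analytic score.
smin, smax = 0.01, 10.0
sigma_data = 1.0
log_ratio = np.log(smax / smin)

def sigma(t):
    return smin * (smax / smin) ** t

def score(x, t):
    return -x / (sigma_data**2 + sigma(t)**2)

def ode_sample(n=20000, steps=1000, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma(1.0), size=n)  # draw from the prior once
    ts = np.linspace(1.0, 1e-3, steps)
    dt = ts[0] - ts[1]
    for t in ts:
        g2 = 2.0 * sigma(t)**2 * log_ratio  # g(t)^2 = d[sigma^2]/dt
        # Deterministic Euler step of the probability flow ODE,
        # integrated backward in time (hence the + sign on the score term)
        x = x + 0.5 * g2 * score(x, t) * dt
    return x

print(round(float(ode_sample().std()), 2))  # deterministic map, std near sigma_data
```

Because the map is deterministic and invertible, the change-of-variables formula applies along the trajectory, which is what enables the exact likelihood computation the summary refers to; in practice one would use an adaptive black-box ODE solver rather than fixed-step Euler.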