CONTRACTIVE DIFFUSION PROBABILISTIC MODELS

May 24, 2024 | WENPIN TANG AND HANYANG ZHAO
This paper introduces Contractive Diffusion Probabilistic Models (CDPMs), a class of diffusion probabilistic models (DPMs) that improve robustness to score-matching and discretization errors by enforcing contraction in the backward sampling process. DPMs are generative models that rely on time-reversed diffusion processes and score matching, and existing analyses typically assume that the learned score is close to the true score. CDPMs relax this assumption: a contractive backward process limits how score-matching and discretization errors propagate during sampling, leading to more stable and accurate generation.

CDPMs are constructed by imposing a contraction condition on the backward process, which is achieved by modifying the parameters of the forward diffusion so that the resulting backward dynamics contract over time. Notably, CDPMs are compatible with existing score-based generative models: they can be built on top of pre-trained DPMs through a simple transformation, without any retraining.

The proposed CDPMs are evaluated on synthetic 1D examples, the Swiss Roll dataset, MNIST, CIFAR-10, and AFHQ, where they outperform existing SDE-based DPMs. Theoretical analysis shows that CDPMs admit tighter Wasserstein bounds between the target data distribution and the distribution of generated samples, a metric that aligns with the Fréchet Inception Distance (FID) used in image synthesis. The paper also discusses the connection between CDPMs and Variance Exploding (VE) SDEs, showing that VE can be viewed as an implicit CDPM at earlier denoising steps.
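The intuition behind contraction can be illustrated with a toy one-dimensional sampler. In the sketch below, the backward Euler-Maruyama update includes a linear pull `-lam * x`; this is an illustrative stand-in for the paper's contraction condition, not the authors' exact construction, and the Gaussian score, step size, and `lam` value are assumptions chosen for the demo.

```python
import numpy as np

def contractive_backward_sampler(x0, score, n_steps=100, dt=0.01,
                                 lam=2.0, sigma=1.0, rng=None):
    """Euler-Maruyama discretization of a backward SDE whose drift adds a
    linear pull -lam * x. With the same driving noise, two trajectories
    contract toward each other by a factor of (1 - (lam + sigma**2) * dt)
    per step (exact here, because the score below is linear)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        drift = -lam * x + sigma ** 2 * score(x)
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

# Exact score of a standard Gaussian target (illustrative choice).
score = lambda x: -x

# Drive two samplers from far-apart starts with *identical* noise streams.
xa = contractive_backward_sampler(np.array([5.0]), score, rng=np.random.default_rng(7))
xb = contractive_backward_sampler(np.array([-5.0]), score, rng=np.random.default_rng(7))

gap_start = 10.0
gap_end = float(abs(xa[0] - xb[0]))  # about 10 * 0.97**100, roughly 0.48
```

Because the shared noise cancels in the difference of the two trajectories, the gap between them shrinks geometrically, which is the mechanism by which a contractive backward process keeps early score or discretization errors from accumulating.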
The results demonstrate that CDPMs outperform other SDE-based DPMs in both theoretical guarantees and empirical performance, particularly on image synthesis tasks. The paper concludes that CDPMs offer a promising approach to improving the robustness and efficiency of diffusion models, with potential for further research in this direction.