FLOW MATCHING FOR GENERATIVE MODELING

8 Feb 2023 | Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le
The paper introduces a new paradigm for generative modeling based on Continuous Normalizing Flows (CNFs), allowing for the training of CNFs at unprecedented scale. The key contribution is the introduction of Flow Matching (FM), a simulation-free approach that regresses vector fields of fixed conditional probability paths. FM is compatible with a general family of Gaussian probability paths, including existing diffusion paths as specific instances. Using FM with diffusion paths results in more robust and stable training, and it also enables the use of other non-diffusion probability paths, such as Optimal Transport (OT) displacement interpolation. OT paths are more efficient, provide faster training and sampling, and result in better generalization.
Empirical results on ImageNet show that FM-trained CNFs achieve better performance in likelihood estimation and sample quality compared to alternative diffusion-based methods, with faster and more reliable sample generation using off-the-shelf numerical ODE solvers. The paper also discusses the construction of conditional probability paths and vector fields, and provides theoretical guarantees for the effectiveness of FM.
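To make the "regresses vector fields of fixed conditional probability paths" idea concrete, here is a minimal NumPy sketch of the conditional Flow Matching objective with the OT (straight-line) path. The function names, the generic `model(t, x)` callable, and the toy shapes are illustrative assumptions, not the paper's implementation; a real setup would use a neural network and stochastic gradient descent.

```python
import numpy as np

def cfm_loss_ot(model, x1, sigma_min=1e-4, seed=0):
    """One Monte-Carlo estimate of the conditional Flow Matching loss
    with the OT displacement path.

    model : any callable v(t, x) -> array of shape (n, d) approximating
            the vector field (hypothetical placeholder for a network).
    x1    : (n, d) batch of data samples.
    """
    rng = np.random.default_rng(seed)
    n, d = x1.shape
    t = rng.uniform(size=(n, 1))           # t ~ U[0, 1], one per sample
    x0 = rng.standard_normal((n, d))       # noise sample x0 ~ N(0, I)
    # OT path sample: x_t = (1 - (1 - sigma_min) t) x0 + t x1
    xt = (1.0 - (1.0 - sigma_min) * t) * x0 + t * x1
    # Conditional target field along that path: u_t = x1 - (1 - sigma_min) x0
    ut = x1 - (1.0 - sigma_min) * x0
    vt = model(t, xt)
    # Mean squared regression error between model field and target field
    return np.mean(np.sum((vt - ut) ** 2, axis=1))
```

Because the path and target field are given in closed form, no ODE simulation is needed during training; the model is trained by plain regression, which is what makes FM simulation-free.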