FLOW MATCHING FOR GENERATIVE MODELING

8 Feb 2023 | Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le
This paper introduces Flow Matching (FM), a simulation-free approach for training Continuous Normalizing Flows (CNFs). FM enables training CNFs at unprecedented scale by regressing the vector fields of fixed conditional probability paths. FM is compatible with a general family of Gaussian probability paths, including diffusion paths, and offers a more robust and stable alternative for training diffusion models.

Beyond diffusion, FM opens the door to training CNFs with non-diffusion probability paths, such as those based on Optimal Transport (OT) displacement interpolation. These paths are more efficient than diffusion paths, faster to train and sample from, and result in better generalization.

Training CNFs with FM on ImageNet leads to consistently better performance than alternative diffusion-based methods in terms of both likelihood and sample quality, and allows fast and reliable sample generation using off-the-shelf numerical ODE solvers. The paper also shows that FM can be used for conditional sampling and image super-resolution. The key contributions are the introduction of FM, the development of Conditional Flow Matching (CFM), and the demonstration of FM's effectiveness on ImageNet; in the experiments, FM outperforms existing methods in sample quality, training efficiency, and computational cost. The paper also develops the theoretical foundations of FM, including the connection between conditional and marginal vector fields and the gradient equivalence of the FM and CFM objectives. Overall, FM provides a more efficient and effective approach to training CNFs, with potential applications in a wide range of generative modeling tasks.
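To make the objectives concrete, here is a sketch of the FM/CFM pair in the paper's notation, where $v_\theta$ is the learned vector field, $u_t$ the target field, $p_t$ the (conditional) probability path, and $q$ the data distribution (exact conditions are in the paper):

$$\mathcal{L}_{\mathrm{FM}}(\theta) = \mathbb{E}_{t \sim \mathcal{U}[0,1],\; x \sim p_t(x)} \left\| v_\theta(t, x) - u_t(x) \right\|^2,$$

$$\mathcal{L}_{\mathrm{CFM}}(\theta) = \mathbb{E}_{t \sim \mathcal{U}[0,1],\; x_1 \sim q(x_1),\; x \sim p_t(x \mid x_1)} \left\| v_\theta(t, x) - u_t(x \mid x_1) \right\|^2.$$

The marginal field $u_t(x)$ is intractable, but the two losses have identical gradients in $\theta$, so minimizing the tractable per-sample CFM objective trains the same model.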
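As a minimal PyTorch sketch (not the authors' code), one CFM training step with the OT displacement path $p_t(x \mid x_1) = \mathcal{N}(t x_1, (1 - (1 - \sigma_{\min}) t)^2 I)$ could look as follows; the names `v_theta` and `cfm_loss` and the `sigma_min` default are illustrative assumptions:

```python
import torch

def cfm_loss(v_theta, x1, sigma_min=1e-4):
    """Conditional Flow Matching loss with the OT displacement path.

    v_theta: network mapping (x_t, t) to a vector field value (illustrative name).
    x1: a batch of data samples.
    """
    b = x1.shape[0]
    t = torch.rand(b, device=x1.device)            # t ~ U[0, 1], one per sample
    tb = t.view(b, *([1] * (x1.dim() - 1)))        # reshape t to broadcast over features
    x0 = torch.randn_like(x1)                      # noise sample x0 ~ N(0, I)
    # OT conditional path: x_t = (1 - (1 - sigma_min) * t) * x0 + t * x1
    xt = (1 - (1 - sigma_min) * tb) * x0 + tb * x1
    # Conditional target u_t(x_t | x1) simplifies to x1 - (1 - sigma_min) * x0
    target = x1 - (1 - sigma_min) * x0
    return ((v_theta(xt, t) - target) ** 2).mean()
```

Each step needs only a data batch, Gaussian noise, and one network evaluation, which is what makes the approach simulation-free: no ODE is integrated during training.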
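Sampling then amounts to integrating the learned ODE $\dot{x} = v_\theta(x, t)$ from noise at $t = 0$ to data at $t = 1$ with an off-the-shelf solver. A sketch using `torchdiffeq` (one such solver library; the `sample` helper is hypothetical):

```python
import torch
from torchdiffeq import odeint  # off-the-shelf adaptive ODE solvers

@torch.no_grad()
def sample(v_theta, shape, device="cpu"):
    """Integrate dx/dt = v_theta(x, t) from t=0 (noise) to t=1 (data)."""
    x0 = torch.randn(shape, device=device)          # start from the noise prior

    def f(t, x):
        # odeint passes a scalar t; expand it to a per-sample batch for the model
        return v_theta(x, t.expand(x.shape[0]))

    ts = torch.tensor([0.0, 1.0], device=device)
    xs = odeint(f, x0, ts, method="dopri5")         # adaptive Dormand-Prince solver
    return xs[-1]                                   # state at t=1 is the generated sample
```

An adaptive solver decides the number of function evaluations on its own, which is where the straighter OT paths pay off: fewer evaluations are needed than for curved diffusion paths.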