T-Stitch: Accelerating Sampling in Pre-Trained Diffusion Models with Trajectory Stitching

21 Feb 2024 | Zizheng Pan, Bohan Zhuang, De-An Huang, Weili Nie, Zhiding Yu, Chaowei Xiao, Jianfei Cai, Anima Anandkumar
The paper introduces T-Stitch, a technique to accelerate sampling in pre-trained diffusion models (DPMs) by leveraging smaller models at the initial stages of the sampling process. T-Stitch aims to improve sampling efficiency without significant degradation in generation quality. The key insight is that different DPMs can learn similar encodings under the same training data distribution, allowing smaller models to generate good global structures early in the sampling process. The method dynamically allocates computation between different DPM sizes, using a smaller model for the initial steps and switching to a larger model later. Extensive experiments demonstrate that T-Stitch is training-free, generally applicable to various architectures, and complementary to existing fast sampling techniques. It achieves flexible speed-quality trade-offs and improves prompt alignment in stylized models. The method is validated on different DPMs, including DiT, U-Net, and Stable Diffusion, showing consistent efficiency gains and maintaining or improving generation quality.
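The core mechanism, running a small denoiser for the early (high-noise) steps and handing the latent to a large denoiser for the remaining steps, can be sketched as a simple loop. This is a minimal illustration, not the paper's implementation: the function and parameter names (`stitched_sampling`, `frac_small`) are hypothetical, and a real sampler would apply a proper noise schedule and solver at each step.

```python
def stitched_sampling(small_denoiser, large_denoiser, x_T, num_steps, frac_small):
    """Trajectory stitching sketch: the small model handles the first
    `frac_small` fraction of denoising steps (coarse global structure),
    then the large model refines the latent for the remaining steps.
    `small_denoiser` and `large_denoiser` are callables (latent, step) -> latent.
    Hypothetical interface for illustration only."""
    x = x_T  # initial noisy latent
    switch_step = int(num_steps * frac_small)  # where the trajectory is stitched
    for t in range(num_steps):
        # Use the small model early, the large model late in the trajectory.
        denoiser = small_denoiser if t < switch_step else large_denoiser
        x = denoiser(x, t)  # one denoising step with the chosen model
    return x
```

Because the switch point `frac_small` is a single scalar, sweeping it from 0 to 1 yields the flexible speed-quality trade-off described above: larger fractions assigned to the small model give faster sampling at some cost in fidelity.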