Bigger is not Always Better: Scaling Properties of Latent Diffusion Models

1 Apr 2024 | Kangfu Mei*, Zhengzhong Tu, Mauricio Delbracio, Hossein Talebi, Vishal M. Patel, Peyman Milanfar
The paper "Bigger is Not Always Better: Scaling Properties of Latent Diffusion Models" by Kangfu Mei et al. explores the scaling properties of latent diffusion models (LDMs) with a focus on their sampling efficiency. The authors investigate how model size influences sampling efficiency across varying sampling steps, using established text-to-image diffusion models. Their findings reveal that smaller models often outperform larger ones in generating high-quality results within a given inference budget. This trend holds across different diffusion samplers, downstream tasks, and post-distilled models; the study also demonstrates that diffusion distillation does not alter these scaling trends. The results suggest new strategies for enhancing LDMs' generative capabilities within limited inference budgets, highlighting the importance of optimizing model size for efficient sampling.
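The intuition behind the fixed-inference-budget comparison can be sketched as follows: under a fixed total compute budget, a smaller model can afford more denoising steps than a larger one. This is a minimal illustration only; the function name and the per-step costs are hypothetical and do not come from the paper.

```python
def steps_within_budget(budget_gflops: int, cost_per_step_gflops: int) -> int:
    """Number of sampling steps affordable under a total compute budget."""
    return budget_gflops // cost_per_step_gflops

# Hypothetical per-step costs for a small vs. a large latent diffusion model.
BUDGET = 5000            # total GFLOPs allowed to generate one image (assumed)
SMALL_MODEL_COST = 50    # GFLOPs per denoising step (assumed)
LARGE_MODEL_COST = 250   # GFLOPs per denoising step (assumed)

# The smaller model gets 100 steps within the same budget that allows
# the larger model only 20 steps, which is why it can produce
# comparable or better samples at equal inference cost.
print(steps_within_budget(BUDGET, SMALL_MODEL_COST))  # 100
print(steps_within_budget(BUDGET, LARGE_MODEL_COST))  # 20
```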