This paper introduces Trajectory Consistency Distillation (TCD), an improved latent consistency distillation method that enhances the performance of Latent Consistency Models (LCMs). TCD comprises two key components: the Trajectory Consistency Function (TCF) for training and Strategic Stochastic Sampling (SSS) for inference. The TCF reduces parameterisation and distillation errors by expanding the boundary conditions of the consistency model, enabling seamless transitions between arbitrary points along the trajectory governed by the Probability Flow ODE (PF ODE); written in semi-linear form with an exponential integrator, it can accurately trace the entire PF-ODE trajectory. SSS provides explicit control over stochasticity at inference time and suppresses accumulated discretisation and estimation errors through bijective traversal of the trajectory. Minimal sketches of both components are given below.
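As a concrete illustration of the first component, here is a minimal sketch of a first-order trajectory consistency function in the semi-linear, exponential-integrator (DPM-Solver++-style, data-prediction) form described above. The helper names `eps_model`, `alpha`, and `sigma` are illustrative assumptions (an ε-parameterised student network and noise-schedule coefficients returned as tensors), not the paper's exact implementation.

```python
import torch

def tcf_first_order(x_t, t, s, eps_model, alpha, sigma):
    """First-order trajectory consistency function f_theta(x_t; t -> s).

    Maps a noisy latent x_t at time t to the point at an earlier time s <= t
    on the same PF-ODE trajectory, using a semi-linear exponential-integrator
    update (DPM-Solver++-style, data prediction).

    alpha(t), sigma(t): noise-schedule coefficients (tensors broadcastable
        with x_t) such that x_t = alpha(t) * x_0 + sigma(t) * eps.
    eps_model(x, t): the distilled student network, assumed eps-parameterised.
    """
    # Predicted clean latent from the eps-parameterised student network.
    x0_pred = (x_t - sigma(t) * eps_model(x_t, t)) / alpha(t)

    # Half log-SNR lambda = log(alpha / sigma); h > 0 whenever s < t.
    h = torch.log(alpha(s) / sigma(s)) - torch.log(alpha(t) / sigma(t))

    # Semi-linear step from t to s along the PF ODE. Setting s = t returns
    # x_t, so the usual consistency-model boundary condition is recovered as
    # a special case of this broadened parameterisation.
    return (sigma(s) / sigma(t)) * x_t + alpha(s) * (1.0 - torch.exp(-h)) * x0_pred
```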
Experiments show that TCD significantly improves image quality at low NFEs and produces more detailed results than the teacher model at high NFEs. It outperforms LCM in both quality and speed, maintains superior generative quality even at high NFEs, and can surpass the teacher model (e.g., SDXL with DPMSolver++) when sampling with sufficient iterations (e.g., 20 NFEs). The gains are especially pronounced in reducing errors arising during the parameterisation process.

The paper also reviews related work on diffusion models, text-conditional diffusion models, fast sampling of DMs, and consistency models, and provides a detailed overview of the preliminary concepts (diffusion models, consistency models, and latent consistency models) before presenting the TCD method and its experimental evaluation. It concludes that TCD is an effective method for improving the performance of LCMs and achieving high-quality image generation.
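To complement the TCF sketch above, the following is a minimal, hedged sketch of the γ-controlled sampling loop behind SSS, assuming a variance-preserving schedule in continuous time. The names `tcf` and `alpha_bar`, and the timestep handling, are illustrative assumptions rather than the paper's exact implementation; γ = 0 corresponds to fully deterministic multistep sampling, while γ = 1 jumps to a clean estimate and fully re-noises at every step (LCM-style multistep sampling).

```python
import torch

def sss_sample(x_T, timesteps, tcf, alpha_bar, gamma=0.3, generator=None):
    """Gamma-controlled sampling loop (Strategic Stochastic Sampling sketch).

    timesteps: decreasing schedule of continuous times, e.g. [t_N, ..., t_1, 0].
    tcf(x, t, s): a trajectory consistency function such as tcf_first_order.
    alpha_bar(t): squared signal coefficient of a variance-preserving schedule.
    gamma in [0, 1]: controls how much stochasticity is injected per step.
    """
    x = x_T
    for t, t_prev in zip(timesteps[:-1], timesteps[1:]):
        # Step deterministically along the trajectory to an intermediate
        # time s below the target time t_prev, instead of stopping at t_prev.
        s = (1.0 - gamma) * t_prev
        x = tcf(x, t, s)

        # Re-inject Gaussian noise to move forward from s to t_prev, so each
        # iteration alternates a backward ODE jump with a forward diffusion
        # step; this is the bijective traversal that damps accumulated error.
        if gamma > 0 and t_prev > 0:
            ratio = alpha_bar(t_prev) / alpha_bar(s)  # <= 1 since t_prev >= s
            noise = torch.randn(x.shape, generator=generator,
                                device=x.device, dtype=x.dtype)
            x = ratio ** 0.5 * x + (1.0 - ratio) ** 0.5 * noise
    return x
```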