Latent Diffusion Transformer for Probabilistic Time Series Forecasting

2024 | Shibo Feng, Chunyan Miao, Zhong Zhang, Peilin Zhao
The paper introduces the Latent Diffusion Transformer (LDT), a novel framework for probabilistic time series forecasting, particularly for high-dimensional multivariate time series. The LDT approach aims to improve the expressiveness of each timestamp and make forecasting more manageable by condensing high-dimensional data into a latent space. The framework consists of two main components: a symmetric statistics-aware autoencoder and a diffusion-based conditional generator. The autoencoder dynamically updates global statistics during training to accurately reconstruct future timestamps, while the LDT generator uses a self-conditioning mechanism and a non-autoregressive transformer to efficiently generate realistic multivariate timestamp values in a continuous latent space.
Extensive experiments on various real-world datasets demonstrate that the LDT model outperforms existing state-of-the-art methods in terms of both accuracy and efficiency. Key contributions include the introduction of the LDT model, a practical structure featuring a unique self-conditioning mechanism, and superior performance on high-dimensional multivariate time series forecasting tasks.
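The core idea, forecasting by diffusing in a compressed latent space rather than the raw observation space, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the "autoencoder" here is a random linear projection standing in for the learned statistics-aware autoencoder, the shapes (`obs_dim`, `latent_dim`, `horizon`) are made-up, and the noising step is the standard DDPM-style forward process that latent diffusion models typically use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: each high-dimensional timestamp (obs_dim)
# is condensed into a lower-dimensional latent vector (latent_dim).
obs_dim, latent_dim, horizon = 64, 8, 12

# Stand-in linear encoder/decoder; the paper trains a symmetric
# statistics-aware autoencoder, these random weights are placeholders.
W_enc = rng.standard_normal((obs_dim, latent_dim)) / np.sqrt(obs_dim)
W_dec = np.linalg.pinv(W_enc)

def encode(x):
    """(horizon, obs_dim) -> (horizon, latent_dim) latent sequence."""
    return x @ W_enc

def decode(z):
    """(horizon, latent_dim) -> (horizon, obs_dim) reconstruction."""
    return z @ W_dec

# Standard DDPM forward noising applied in the latent space:
# q(z_t | z_0) = N(sqrt(alpha_bar_t) * z_0, (1 - alpha_bar_t) * I).
T = 100
betas = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - betas)

def q_sample(z0, t):
    """Noise a clean latent z0 to diffusion step t."""
    noise = rng.standard_normal(z0.shape)
    return np.sqrt(alpha_bar[t]) * z0 + np.sqrt(1.0 - alpha_bar[t]) * noise

# A future window is encoded, then noised; the (omitted) conditional
# transformer would denoise z_t back to z_0 given the past context,
# and decode() would map the sampled latents to forecasts.
x_future = rng.standard_normal((horizon, obs_dim))
z0 = encode(x_future)
zt = q_sample(z0, t=50)
```

Working at `latent_dim` instead of `obs_dim` is what makes the denoising transformer cheap for high-dimensional series; the non-autoregressive design then lets all `horizon` latents be generated jointly rather than one step at a time.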