TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables


2024 | Yuxuan Wang, Haixu Wu, Jiaxiang Dong, Guo Qin, Haoran Zhang, Yong Liu, Yunzhong Qiu, Jianmin Wang, Mingsheng Long
TimeXer is a novel approach for time series forecasting with exogenous variables, enhancing the forecasting of endogenous variables by integrating external information. The model leverages patch-wise and variate-wise attention mechanisms to capture temporal dependencies and multivariate correlations, and introduces learnable global tokens to bridge causal information from exogenous series to endogenous temporal patches.

TimeXer is designed to work with the canonical Transformer without architectural modifications, enabling it to model exogenous and endogenous variables simultaneously. Its structure includes endogenous and exogenous embeddings, self-attention, and cross-attention mechanisms that capture both intra-endogenous temporal dependencies and exogenous-to-endogenous correlations. It effectively handles irregular and heterogeneous exogenous series, including missing values, temporal misalignment, and frequency mismatch.

TimeXer achieves state-of-the-art performance on twelve real-world forecasting benchmarks, demonstrating strong generalization and scalability. Evaluated on both short-term and long-term forecasting tasks, it outperforms existing baselines, remains robust in scenarios with missing values, and scales to large time series datasets. Its efficiency is validated through training-time and memory-footprint comparisons with other baselines. Overall, TimeXer provides a flexible and effective solution for time series forecasting with exogenous variables.
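The token layout described above (patch tokens for the endogenous series, one variate-level token per exogenous series, and a learnable global token linking the two via self- and cross-attention) can be sketched in PyTorch. This is a minimal illustration inferred from the summary, not the authors' implementation; all names and dimensions (`TimeXerSketch`, `lookback`, `d_model`, etc.) are assumptions.

```python
# Hypothetical sketch of TimeXer's token structure, inferred from the summary.
# Not the official implementation; names and dimensions are illustrative.
import torch
import torch.nn as nn

class TimeXerSketch(nn.Module):
    def __init__(self, lookback=96, patch_len=16, d_model=64, n_heads=4):
        super().__init__()
        assert lookback % patch_len == 0
        self.patch_len = patch_len
        self.n_patches = lookback // patch_len
        # Endogenous series is split into patches, each embedded as one token.
        self.patch_embed = nn.Linear(patch_len, d_model)
        # Each exogenous variate is embedded as a single variate-wise token.
        self.exog_embed = nn.Linear(lookback, d_model)
        # Learnable global token bridges exogenous info to endogenous patches.
        self.global_token = nn.Parameter(torch.randn(1, 1, d_model))
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, endo, exog):
        # endo: (B, lookback); exog: (B, n_exog, lookback)
        B = endo.size(0)
        patches = endo.view(B, self.n_patches, self.patch_len)
        endo_tokens = self.patch_embed(patches)            # (B, P, d_model)
        glb = self.global_token.expand(B, -1, -1)          # (B, 1, d_model)
        tokens = torch.cat([endo_tokens, glb], dim=1)      # (B, P+1, d_model)
        # Self-attention over patch + global tokens (intra-endogenous).
        tokens, _ = self.self_attn(tokens, tokens, tokens)
        # Cross-attention: the global token queries exogenous variate tokens,
        # injecting exogenous information into the endogenous representation.
        exog_tokens = self.exog_embed(exog)                # (B, n_exog, d_model)
        glb_out, _ = self.cross_attn(tokens[:, -1:], exog_tokens, exog_tokens)
        return torch.cat([tokens[:, :-1], glb_out], dim=1)  # (B, P+1, d_model)
```

With a lookback of 96 and patch length 16, the endogenous series yields 6 patch tokens plus the global token, so the output has 7 tokens per sample; a forecasting head over these tokens would complete the model.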