FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting


PMLR 162, Baltimore, Maryland, USA, 2022 | Tian Zhou*, Ziqing Ma*, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin
FEDformer (Frequency Enhanced Decomposed Transformer) is a model for long-term time series forecasting that combines seasonal-trend decomposition with a Transformer architecture to better capture the global profile of a series. It applies Fourier and wavelet transforms so that attention operates on frequency components rather than on every time step, reducing computational complexity to linear in the sequence length. This linear complexity comes from randomly selecting a fixed number of frequency modes instead of attending over all of them, which makes FEDformer more efficient than a standard Transformer. A mixture-of-experts mechanism for seasonal-trend decomposition further improves forecasting performance, and the overall design helps the forecasts preserve the distribution of the input series over long horizons, addressing the difficulty traditional Transformers have in capturing global trends. Empirical studies on six benchmark datasets show that FEDformer reduces prediction error by 14.8% for multivariate and 22.6% for univariate forecasting compared to state-of-the-art methods, and its effectiveness is supported by both theoretical analysis and extensive experiments.
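To make the two mechanisms mentioned above concrete, here is a minimal, self-contained sketch (not the authors' implementation): a seasonal-trend decomposition using a single moving average, and a frequency-enhanced block that keeps only a randomly selected subset of Fourier modes. The function names series_decomp and frequency_enhanced_block, the kernel size, and the number of retained modes are illustrative assumptions; the actual model mixes several moving-average kernels with learned (mixture-of-experts) weights and applies learned transformations to the retained modes.

import numpy as np

def series_decomp(x, kernel_size=25):
    # Split a 1-D series into seasonal and trend parts with a moving average.
    # (Simplification: FEDformer mixes several such averages with learned weights.)
    pad = kernel_size // 2
    padded = np.concatenate([np.repeat(x[:1], pad), x, np.repeat(x[-1:], pad)])
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(padded, kernel, mode="valid")[: len(x)]
    seasonal = x - trend
    return seasonal, trend

def frequency_enhanced_block(x, num_modes=8, seed=0):
    # Keep only a random subset of Fourier modes and transform back.
    # The random mode selection is what bounds the work done in the frequency
    # domain by a constant number of coefficients, independent of series length.
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(x)
    keep = rng.choice(len(spectrum), size=min(num_modes, len(spectrum)), replace=False)
    filtered = np.zeros_like(spectrum)
    filtered[keep] = spectrum[keep]
    return np.fft.irfft(filtered, n=len(x))

if __name__ == "__main__":
    t = np.arange(512)
    x = np.sin(2 * np.pi * t / 24) + 0.01 * t + 0.1 * np.random.default_rng(1).standard_normal(512)
    seasonal, trend = series_decomp(x)
    smoothed = frequency_enhanced_block(seasonal, num_modes=8)
    print(seasonal.shape, trend.shape, smoothed.shape)

Note that this sketch uses a full FFT (O(L log L)) for clarity; the paper's linear-complexity claim rests on operating only on the pre-selected fixed set of modes, so the cost of the frequency-domain mixing does not grow quadratically with the sequence length.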