Baltimore, Maryland, USA, PMLR 162, 2022 | Tian Zhou*, Ziqing Ma*, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin
The paper introduces FEDformer, a frequency-enhanced decomposed Transformer for long-term time series forecasting. It addresses two limitations of vanilla Transformers on this task: they are computationally expensive (quadratic in sequence length) and struggle to capture the global profile of a series. FEDformer combines seasonal-trend decomposition with the Transformer: the decomposition extracts the overall trend, while the attention mechanism models the detailed seasonal structure. To improve efficiency, FEDformer exploits the observation that many time series have a sparse representation in the Fourier basis, and introduces frequency-enhanced blocks that operate on a small subset of Fourier modes. The resulting model has linear complexity in sequence length and, across extensive experiments on six benchmark datasets, reduces prediction error relative to state-of-the-art methods by 14.8% on multivariate and 22.6% on univariate forecasting tasks.
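The two ingredients above can be sketched in a few lines. This is a minimal illustrative NumPy sketch, not the paper's actual layer: `series_decomp` is a moving-average seasonal-trend split, and `frequency_enhanced_block` keeps a random subset of Fourier modes and mixes them with per-mode complex weights (random stand-ins here for what would be learned parameters). All function names, the kernel size, and the mode count are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def series_decomp(x, kernel=25):
    """Split x (seq_len, d_model) into seasonal and trend parts.

    Trend = centered moving average (edge-padded to preserve length);
    seasonal = residual, so seasonal + trend == x exactly.
    """
    pad = kernel // 2
    xp = np.pad(x, ((pad, kernel - 1 - pad), (0, 0)), mode="edge")
    trend = np.stack(
        [np.convolve(xp[:, j], np.ones(kernel) / kernel, mode="valid")
         for j in range(x.shape[1])],
        axis=1,
    )
    return x - trend, trend

def frequency_enhanced_block(x, num_modes=8):
    """Keep a random sparse subset of Fourier modes, transform them, invert.

    The FFT costs O(seq_len log seq_len) and the mode mixing costs
    O(num_modes * d_model), so the block scales near-linearly in seq_len.
    """
    seq_len, d_model = x.shape
    freq = np.fft.rfft(x, axis=0)                  # (seq_len//2 + 1, d_model)
    n_freq = freq.shape[0]
    modes = rng.choice(n_freq, size=min(num_modes, n_freq), replace=False)
    # Random complex weights stand in for learned per-mode parameters.
    W = rng.standard_normal((len(modes), d_model)) \
        + 1j * rng.standard_normal((len(modes), d_model))
    out_freq = np.zeros_like(freq)
    out_freq[modes] = freq[modes] * W              # only selected modes survive
    return np.fft.irfft(out_freq, n=seq_len, axis=0)

# Toy input: a sinusoid plus a linear trend, 96 steps, 4 channels.
t = np.linspace(0, 6 * np.pi, 96)[:, None]
x = (np.sin(t) + 0.05 * t) * np.ones((1, 4))
seasonal, trend = series_decomp(x)
y = frequency_enhanced_block(seasonal, num_modes=8)
```

In the full model, blocks like these replace self-attention inside an encoder-decoder stack; the sketch only shows why sparse mode selection yields the linear-complexity claim.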