Fredformer: Frequency Debiased Transformer for Time Series Forecasting

August 25-29, 2024 | Xihao Piao, Zheng Chen, Taichi Murayama, Yasuko Matsubara, and Yasushi Sakurai
Fredformer is a Frequency Debiased Transformer designed for accurate time series forecasting. The Transformer model has shown strong performance in time series forecasting but can suffer from frequency bias, where it disproportionately focuses on low-frequency features while neglecting high-frequency ones. This bias can hinder accurate forecasting by overlooking important high-frequency data features. The paper investigates this bias and proposes Fredformer to mitigate it by learning features equally across different frequency bands.

The model uses frequency decomposition and incorporates frequency normalization and local frequency-independent learning to ensure balanced attention across all frequency components. Extensive experiments show that Fredformer outperforms other baselines on various real-world time series datasets. A lightweight variant of Fredformer with attention matrix approximation achieves comparable performance with fewer parameters and lower computational costs. The model's effectiveness is demonstrated through case studies and ablation experiments, showing that it reduces frequency bias and improves forecasting accuracy. The paper also discusses the practical deployment of Fredformer, highlighting its efficiency and effectiveness in real-world scenarios.
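The idea of frequency decomposition with per-band normalization can be illustrated with a minimal sketch. This is not the paper's implementation or API; the function name, band-splitting scheme, and normalization details below are illustrative assumptions. The series is moved to the frequency domain, split into equal-width bands, and each band is normalized independently so that high-amplitude low-frequency bands cannot dominate the ones with smaller magnitudes.

```python
import numpy as np

def frequency_band_tokens(x, num_bands=4, eps=1e-8):
    """Illustrative sketch (not the paper's code): decompose a series
    into frequency bands and normalize each band independently."""
    spec = np.fft.rfft(x)                    # real FFT of the series
    bands = np.array_split(spec, num_bands)  # roughly equal-width bands
    tokens = []
    for band in bands:
        # stack real/imag parts so each band becomes a real-valued token
        feat = np.concatenate([band.real, band.imag])
        # per-band normalization: zero mean, unit variance within the band
        feat = (feat - feat.mean()) / (feat.std() + eps)
        tokens.append(feat)
    return tokens

# Example: a strong low-frequency sine plus a weak high-frequency one
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 128, endpoint=False)
x = (np.sin(2 * np.pi * 3 * t)
     + 0.1 * np.sin(2 * np.pi * 30 * t)
     + 0.01 * rng.standard_normal(128))
tokens = frequency_band_tokens(x, num_bands=4)
```

After normalization every band token has comparable scale, so a downstream attention module (as in a channel-wise Transformer over these tokens) sees the weak high-frequency band on equal footing with the dominant low-frequency one.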