2024 | Lu Han; Xu-Yang Chen; Han-Jia Ye; De-Chuan Zhan
SOFTS: Efficient Multivariate Time Series Forecasting with Series-Core Fusion
Multivariate time series forecasting is crucial in fields like finance, traffic, energy, and healthcare. Recent studies highlight the benefits of channel independence but neglect channel correlations, limiting further improvement. Existing methods capture correlations with attention or mixer mechanisms, but these either introduce excessive complexity or rely heavily on correlations that become unreliable under distribution drift. This paper presents SOFTS, an efficient MLP-based model built around a novel STAR module. Unlike traditional methods that use distributed structures to compare channels pairwise, STAR employs a centralized strategy: it aggregates all series into a global core representation, which is then dispatched and fused with the individual series representations to enable channel interaction. This design achieves linear complexity in the number of channels and reduces reliance on the quality of individual channels. Empirical results show that SOFTS outperforms existing state-of-the-art methods, scales to large numbers of channels and time steps, and remains robust across diverse datasets and scenarios. Moreover, the STAR module is a universal component that can replace attention in many models, an effect validated on multiple transformer-based time series forecasters. The contributions include the introduction of SOFTS, the STAR module, and extensive experiments demonstrating the model's effectiveness and scalability.
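The aggregate-dispatch-fuse mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the mean pooling over channels, the two-layer MLPs, and all dimension names (`d`, `d_core`) are assumptions made for the sketch; the key property it demonstrates is that cost grows linearly with the number of channels `C`, since no pairwise channel comparison is performed.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    # Two-layer MLP with ReLU, applied along the last axis.
    return np.maximum(x @ w1 + b1, 0) @ w2 + b2

def star_block(h, params):
    """Sketch of a STAR-like centralized interaction (details assumed).

    h: (C, d) array of per-channel series embeddings.
    1. Aggregate: pool all channel embeddings into one global core.
    2. Dispatch & fuse: concatenate the core to every channel embedding
       and mix with an MLP -- linear in the number of channels C.
    """
    core = mlp(h, *params["agg"]).mean(axis=0)            # (d_core,)
    core_rep = np.broadcast_to(core, (h.shape[0], core.shape[0]))
    fused = np.concatenate([h, core_rep], axis=1)         # (C, d + d_core)
    return mlp(fused, *params["fuse"])                    # (C, d)

# Hypothetical sizes for the sketch.
C, d, d_core, hidden = 7, 16, 8, 32
params = {
    "agg":  (rng.normal(size=(d, hidden)), np.zeros(hidden),
             rng.normal(size=(hidden, d_core)), np.zeros(d_core)),
    "fuse": (rng.normal(size=(d + d_core, hidden)), np.zeros(hidden),
             rng.normal(size=(hidden, d)), np.zeros(d)),
}
h = rng.normal(size=(C, d))
out = star_block(h, params)
print(out.shape)  # (7, 16): one refined embedding per channel
```

Because every channel interacts only with the shared core rather than with every other channel, a degraded or noisy channel influences the others only through the pooled core, which is the intuition behind the reduced reliance on channel quality.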