Is Mamba Effective for Time Series Forecasting?


27 Apr 2024 | Zihan Wang, Fanheng Kong, Shi Feng*, Ming Wang, Xiaocui Yang, Han Zhao, Daling Wang and Yifei Zhang
This paper investigates the effectiveness of Mamba, a selective state space model, for time series forecasting (TSF). The authors propose a Mamba-based model named Simple-Mamba (S-Mamba). S-Mamba tokenizes the time points of each variate via a linear layer, applies a bidirectional Mamba layer to extract inter-variate correlations and a feed-forward network (FFN) to learn temporal dependencies, then generates forecasts through a final linear mapping layer. Experiments on thirteen public datasets show that S-Mamba keeps computational overhead low while achieving leading performance. The results indicate that Mamba can reduce parameter size and improve inference efficiency while matching or exceeding Transformer performance; it is also strong at capturing global dependencies and retains a better sense of positional relationships. Further analyses probe Mamba's potential in TSF, and the authors conclude that Mamba has significant potential to outperform Transformers in both computational efficiency and forecast accuracy.
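To make the described pipeline concrete, below is a minimal PyTorch sketch of the tokenize → bidirectional Mamba → FFN → project flow. The `SMambaSketch` class, its hyperparameters, and the residual/normalization placement are illustrative assumptions, not the authors' reference implementation; it assumes the `mamba_ssm` package's `Mamba` block.

```python
import torch
import torch.nn as nn
from mamba_ssm import Mamba  # assumes the mamba_ssm package is installed


class SMambaSketch(nn.Module):
    """Hypothetical sketch of the S-Mamba pipeline summarized above."""

    def __init__(self, seq_len: int, pred_len: int, d_model: int = 128):
        super().__init__()
        # Variate tokenization: each variate's full lookback window -> one token
        self.tokenizer = nn.Linear(seq_len, d_model)
        # Bidirectional Mamba over the sequence of variate tokens
        self.mamba_fwd = Mamba(d_model=d_model)
        self.mamba_bwd = Mamba(d_model=d_model)
        self.norm1 = nn.LayerNorm(d_model)
        # FFN to learn temporal dependencies within each variate token
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm2 = nn.LayerNorm(d_model)
        # Linear mapping from token dimension to the forecast horizon
        self.projector = nn.Linear(d_model, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_variates) -> tokens: (batch, n_variates, d_model)
        tokens = self.tokenizer(x.transpose(1, 2))
        # Scan the variate dimension in both directions and merge the results
        h = self.mamba_fwd(tokens) + self.mamba_bwd(tokens.flip(1)).flip(1)
        h = self.norm1(h + tokens)
        h = self.norm2(h + self.ffn(h))
        # (batch, n_variates, pred_len) -> (batch, pred_len, n_variates)
        return self.projector(h).transpose(1, 2)


# Example: forecast 96 future steps from a 96-step window of 7 variates.
# Note: mamba_ssm's selective scan kernel requires a CUDA device.
device = "cuda"
model = SMambaSketch(seq_len=96, pred_len=96).to(device)
y = model(torch.randn(4, 96, 7, device=device))  # -> shape (4, 96, 7)
```

Note the design point this sketch illustrates: like iTransformer-style models, S-Mamba treats each variate (not each time step) as a token, so the Mamba scans run across variates to capture inter-variate correlations, while temporal structure is handled by the token embedding and the FFN.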