The paper introduces RWKV-TS, an efficient RNN-based model designed to address the limitations of traditional RNNs in time series tasks. Traditional RNNs, such as LSTM and GRU, have historically been prominent but have seen a decline in performance due to issues like the vanishing/exploding gradient problem, lack of parallel computation, and error accumulation. RWKV-TS is characterized by three key features: (i) a novel RNN architecture with $O(L)$ time complexity and memory usage, (ii) enhanced ability to capture long-term sequence information, and (iii) high computational efficiency with effective scaling capabilities. Extensive experiments demonstrate that RWKV-TS achieves competitive performance compared to state-of-the-art Transformer-based and CNN-based models, while also showing reduced latency and memory utilization. The model's success highlights the potential of RNNs in time series tasks and encourages further research in this area. The code for RWKV-TS is available at: https://github.com/howard-hou/RWKV-TS.
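As a rough illustration of the $O(L)$ claim (not the authors' actual time-mixing implementation, which is detailed in the paper and repository), the sketch below shows a generic linear recurrence processed in a single pass: the hidden state has a fixed size independent of the sequence length $L$, and each step costs a constant amount of work. The function and parameter names (`linear_recurrence_scan`, `decay`, `inp_gain`) are purely illustrative assumptions.

```python
import numpy as np

def linear_recurrence_scan(x, decay, inp_gain):
    """Single O(L) pass over a length-L sequence with a fixed-size hidden state.

    x:        (L, D) input sequence
    decay:    (D,)   per-channel decay factor in (0, 1)
    inp_gain: (D,)   per-channel input gain

    The state h has size D regardless of L, and the loop visits each step
    exactly once, so time is O(L * D) and state memory is O(D) -- in contrast
    to self-attention, whose pairwise interactions cost O(L^2) per layer.
    """
    L, D = x.shape
    h = np.zeros(D)
    out = np.empty((L, D))
    for t in range(L):
        h = decay * h + inp_gain * x[t]   # constant-cost update per time step
        out[t] = h
    return out

# Toy usage: a length-1024, 8-channel series processed in one linear pass.
rng = np.random.default_rng(0)
x = rng.standard_normal((1024, 8))
states = linear_recurrence_scan(x, decay=np.full(8, 0.9), inp_gain=np.ones(8))
print(states.shape)  # (1024, 8)
```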