RWKV-TS is a novel RNN-based model for time series tasks, offering better performance and efficiency than traditional RNNs and than other architectures such as Transformers and CNNs. It addresses the main limitations of traditional RNNs: the vanishing-gradient problem, low computational efficiency, and error accumulation during sequential prediction.

At its core is a novel RNN architecture with O(L) time and memory complexity that captures long-term sequence information effectively. Because the recurrence is linear, it can be evaluated in parallel across the sequence, so the model scales efficiently and retains strong information capture even for long sequences; a sketch of such a recurrence follows.
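To make the O(L) claim concrete, here is a minimal sketch of a simplified RWKV-style linear recurrence in PyTorch. The names (`k`, `v`, `w`) and the exact update rule are simplifying assumptions for illustration (RWKV's actual WKV operator also has a per-token bonus term and a numerically stable formulation), not the authors' implementation; the point is that the state update has no nonlinearity between steps, which is what permits O(L) sequential cost and a parallel-scan formulation.

```python
import torch

def linear_recurrence(k, v, w):
    """Sequential O(L) evaluation of a simplified RWKV-style recurrence.

    k, v: (L, D) key/value sequences; w: (D,) per-channel decay in (0, 1).
    Each output is a decayed, exp(k)-weighted running average of past values.
    """
    L, D = v.shape
    num = torch.zeros(D)          # running weighted sum of values
    den = torch.zeros(D)          # running sum of weights (normalizer)
    out = torch.empty(L, D)
    for t in range(L):
        a = torch.exp(k[t])       # non-negative, attention-like weight
        num = w * num + a * v[t]  # linear (affine) state update: no
        den = w * den + a         # nonlinearity between steps
        out[t] = num / (den + 1e-8)
    return out

L, D = 96, 8
out = linear_recurrence(torch.randn(L, D), torch.randn(L, D),
                        torch.full((D,), 0.9))
print(out.shape)  # torch.Size([96, 8])
```

Because each step is an affine update of the state, the whole sequence can also be computed with a parallel prefix scan rather than this explicit loop, which is where the training-time parallelism of linear RNNs comes from.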
The encoder-only design processes the input window in a single pass and predicts the entire forecast horizon at once rather than feeding each prediction back as input, which avoids error accumulation. Evaluated on multiple benchmark datasets, RWKV-TS performs competitively with state-of-the-art models across long-term forecasting, short-term forecasting, imputation, anomaly detection, and classification, while achieving lower latency and memory usage; this makes it well suited to deployment on devices with limited computational and memory resources. Its success challenges the notion that RNNs are obsolete for time series work and demonstrates the potential of linear RNN models, positioning RWKV-TS as a competitive alternative that balances performance and efficiency. A sketch of the direct multi-horizon setup appears below.
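As an illustration of why one-shot horizon prediction avoids error accumulation, here is a hypothetical comparison of the two prediction styles. `backbone` stands in for any sequence encoder (such as a stack of RWKV blocks) and is an assumption for this sketch, not the paper's code.

```python
import torch
import torch.nn as nn

class DirectForecaster(nn.Module):
    """Encoder-only style: one forward pass predicts all H future steps."""
    def __init__(self, backbone: nn.Module, d_model: int, horizon: int):
        super().__init__()
        self.backbone = backbone          # placeholder seq encoder: (B, L, d) -> (B, L, d)
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x):                 # x: (B, L, d_model)
        h = self.backbone(x)              # (B, L, d_model)
        return self.head(h[:, -1])        # (B, H): whole horizon at once

def autoregressive_forecast(model, x, horizon):
    """Step-by-step rollout for contrast: each prediction is fed back in
    as input, so an early error corrupts every later step's input."""
    preds = []
    for _ in range(horizon):
        y = model(x)[:, -1:]              # predict one step: (B, 1, d)
        preds.append(y)
        x = torch.cat([x[:, 1:], y], dim=1)  # feed prediction back in
    return torch.cat(preds, dim=1)        # (B, H, d)
```

In the direct setup, an error at one horizon step never enters the input of another, whereas the rollout compounds its own mistakes; combined with the parallel encoder pass, this is what the summary means by avoiding error accumulation.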