Time Series Forecasting with LLMs: Understanding and Enhancing Model Capabilities

10 Aug 2024 | Hua Tang, Chong Zhang, Mingyu Jin, Qinkai Yu, Zhenting Wang, Xiaobo Jin, Yongfeng Zhang, Mengnan Du
This paper explores the application of large language models (LLMs) to time series forecasting, focusing on their performance and preferences in zero-shot settings. The authors investigate how LLMs handle different types of time series data, particularly series with clear trends and seasonal patterns versus series lacking periodicity. They find that LLMs perform well on datasets with higher trend and seasonal strength, but struggle with series containing multiple periods. To enhance performance, the authors propose two techniques: incorporating external human knowledge into the prompt and converting numerical sequences into natural language. Both methods significantly improve LLM forecasting accuracy. The study contributes to a better understanding of the advantages and limitations of LLMs in this context and offers practical strategies for leveraging their capabilities.
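The paper itself is summarized above without code, but the two proposed prompting techniques can be illustrated with a small sketch. The snippet below is a minimal, hypothetical example, not the authors' implementation: the function names, prompt wording, and the `domain_knowledge` string are assumptions. It serializes a numeric series into a natural-language sentence and optionally prepends external knowledge before the resulting prompt would be sent to an LLM.

```python
from typing import Optional, Sequence


def series_to_text(values: Sequence[float], unit: str = "units") -> str:
    """Convert a numeric sequence into a natural-language description.

    Illustrative only: the paper converts numbers into natural language,
    but the exact wording it uses is not reproduced here.
    """
    parts = [f"{v:.2f} {unit}" for v in values]
    return "The observed values, in order, are: " + ", ".join(parts) + "."


def build_forecast_prompt(
    values: Sequence[float],
    horizon: int,
    domain_knowledge: Optional[str] = None,
) -> str:
    """Assemble a zero-shot forecasting prompt.

    `domain_knowledge` stands in for the paper's "external human knowledge",
    e.g. a known seasonality such as "demand peaks every 7 days".
    """
    lines = []
    if domain_knowledge:
        lines.append(f"Background knowledge: {domain_knowledge}")
    lines.append(series_to_text(values))
    lines.append(
        f"Based on this history, predict the next {horizon} values. "
        "Answer with a comma-separated list of numbers only."
    )
    return "\n".join(lines)


if __name__ == "__main__":
    history = [112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0]
    prompt = build_forecast_prompt(
        history,
        horizon=3,
        domain_knowledge="The series is weekly demand with a strong upward trend.",
    )
    # The prompt string would then be passed to an LLM's chat/completions API.
    print(prompt)
```

The design intent mirrors the paper's findings: describing the numbers in words rather than raw digits, and stating known trend or seasonal structure up front, gives the model the context it otherwise has to infer from the sequence alone.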