Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning

10 Mar 2024 | Yuxuan Bian*, Xuan Ju*, Jiangtong Li*, Zhijian Xu, Dawei Cheng, Qiang Xu
The paper introduces *aLLM4TS*, a novel framework that adapts Large Language Models (LLMs) for time-series representation learning. The key innovation is recasting time-series forecasting as a self-supervised, multi-patch prediction task, which captures temporal dynamics in patch representations more effectively than traditional methods. The framework consists of two stages: (i) causal continual pre-training on a variety of time-series datasets, using next-patch prediction as the objective, and (ii) fine-tuning for multi-patch prediction in the target time-series context. A distinctive feature is the patch-wise decoding layer, which decodes each patch independently into its temporal segment, strengthening the model's handling of patch-based temporal representations. *aLLM4TS* demonstrates superior performance across various downstream tasks, showing that it derives temporal representations with enhanced transferability and marking a significant advance in adapting LLMs for time-series analysis.
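To make the two mechanics above concrete, here is a minimal PyTorch sketch of stage (i): a series is split into patches, a causally masked transformer (a small stand-in for the pre-trained LLM backbone) produces patch representations, and a patch-wise decoder maps each representation back to its own time segment. The names `patchify` and `PatchWiseDecoder`, and all hyperparameters, are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn


def patchify(series: torch.Tensor, patch_len: int, stride: int) -> torch.Tensor:
    """Split a batch of univariate series (batch, length) into patches."""
    return series.unfold(dimension=-1, size=patch_len, step=stride)


class PatchWiseDecoder(nn.Module):
    """Map each patch embedding back to its own time segment independently,
    instead of flattening all patches through one large sequence-level head."""

    def __init__(self, d_model: int, patch_len: int):
        super().__init__()
        self.proj = nn.Linear(d_model, patch_len)  # shared across all patches

    def forward(self, patch_repr: torch.Tensor) -> torch.Tensor:
        # patch_repr: (batch, num_patches, d_model)
        # output:     (batch, num_patches, patch_len), one segment per patch
        return self.proj(patch_repr)


# --- toy usage: stage (i), causal next-patch prediction ---
batch, length, patch_len, stride, d_model = 8, 96, 16, 16, 64
series = torch.randn(batch, length)
patches = patchify(series, patch_len, stride)            # (8, 6, 16)

embed = nn.Linear(patch_len, d_model)                    # patch embedding
backbone = nn.TransformerEncoder(                        # stand-in for the LLM
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
decoder = PatchWiseDecoder(d_model, patch_len)

num_patches = patches.size(1)
causal_mask = nn.Transformer.generate_square_subsequent_mask(num_patches)
hidden = backbone(embed(patches), mask=causal_mask)      # (8, 6, 64)
recon = decoder(hidden)                                  # (8, 6, 16)

# Each patch representation is trained to predict the *following* patch.
next_patch_loss = nn.functional.mse_loss(recon[:, :-1], patches[:, 1:])
```

In stage (ii), the same backbone and shared projection would be fine-tuned to predict multiple future patches at once; note that because the decoder acts per patch, its parameter count stays independent of the forecast horizon, which is the kind of benefit the paper attributes to decoding patches independently rather than through a sequence-level linear head.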