TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling

2024 | Jiaxiang Dong, Haixu Wu, Yuxuan Wang, Yunzhong Qiu, Li Zhang, Jianmin Wang, Mingsheng Long
TimeSiam is a self-supervised pre-training framework for time series modeling that emphasizes temporal correlation. Unlike previous methods built on masked modeling or contrastive learning, TimeSiam uses Siamese networks to capture correlations between temporally distant subseries: it samples pairs of past and current subseries from the same time series at different timestamps, applies simple data augmentation (masking) to generate diverse views, and learns time-dependent representations through past-to-current reconstruction. Learnable lineage embeddings distinguish the temporal distance between the sampled subseries and enhance the learning of diverse temporal correlations.

Concretely, shared Siamese encoders model correlations between the temporally distant subseries, and a decoder with cross-attention and self-attention reconstructs the masked current subseries from the past one. TimeSiam consistently outperforms existing pre-training baselines across 13 standard benchmarks in both in-domain and cross-domain scenarios, and it is effective for both time series forecasting and classification, achieving state-of-the-art performance across downstream tasks. The results demonstrate that TimeSiam is a simple yet effective pre-training method applicable to a wide range of time series analysis tasks.
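To make the pre-training recipe concrete, below is a minimal PyTorch sketch of the idea described above. It is not the authors' implementation: the class name TimeSiamPretrainSketch, the Transformer backbone sizes, and hyperparameters such as mask_ratio and n_lineages are illustrative assumptions, chosen only to show how the Siamese encoder, lineage embeddings, masking, and past-to-current reconstruction fit together.

```python
import torch
import torch.nn as nn

class TimeSiamPretrainSketch(nn.Module):
    """Illustrative sketch of TimeSiam-style pre-training (not the official code)."""

    def __init__(self, n_vars, d_model=64, n_heads=4, n_lineages=3, mask_ratio=0.25):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.embed = nn.Linear(n_vars, d_model)            # per-timestep embedding
        self.lineage = nn.Embedding(n_lineages, d_model)   # learnable lineage embeddings
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)  # shared (Siamese) encoder
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        dec_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.self_attn = nn.TransformerEncoder(dec_layer, num_layers=1)
        self.head = nn.Linear(d_model, n_vars)             # project back to the raw series

    def forward(self, past, current, lineage_id):
        # past, current: (batch, length, n_vars) subseries sampled from the same
        # series at different timestamps; lineage_id encodes their temporal distance.
        mask = torch.rand(current.shape[:2], device=current.device) < self.mask_ratio
        masked_current = current.masked_fill(mask.unsqueeze(-1), 0.0)  # simple masking augmentation

        lin = self.lineage(lineage_id).unsqueeze(1)              # (batch, 1, d_model)
        z_past = self.encoder(self.embed(past) + lin)            # Siamese encoder on past subseries
        z_curr = self.encoder(self.embed(masked_current) + lin)  # same weights on masked current

        # Decoder: cross-attention from current to past, then self-attention.
        z, _ = self.cross_attn(z_curr, z_past, z_past)
        z = self.self_attn(z)
        recon = self.head(z)

        # Past-to-current reconstruction loss on the masked positions.
        return ((recon - current) ** 2)[mask].mean()

# Example pre-training step with random data (shapes are illustrative).
model = TimeSiamPretrainSketch(n_vars=7)
past = torch.randn(8, 96, 7)
current = torch.randn(8, 96, 7)
lineage_id = torch.randint(0, 3, (8,))
loss = model(past, current, lineage_id)
loss.backward()
```

After pre-training under this kind of objective, the encoder would be kept and fine-tuned with a task-specific head for downstream forecasting or classification.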