2024 | Xu Zheng, Tianchun Wang, Wei Cheng, Aitian Ma, Haifeng Chen, Mo Sha, Dongsheng Luo
This paper proposes AutoTCL, a parametric augmentation framework for time series contrastive learning. The authors address the challenge of selecting effective data augmentations for time series data, which is difficult due to the complexity and variability of time series. They introduce a factorization-based approach that separates the informative part of a time series from the task-irrelevant part, enabling the generation of meaningful views for contrastive learning. AutoTCL uses a parametric neural network to learn to factorize an instance into these two components and applies a lossless transformation to preserve the semantics of the original data. The framework is encoder-agnostic, allowing it to be seamlessly integrated with different backbone encoders.
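The factorization idea can be sketched as follows: a mask network scores which timesteps are informative, and only that part is transformed while the rest is passed through unchanged (roughly v = h ⊙ g(x) + (1 − h) ⊙ x, with h the learned mask and g the learned transformation). Below is a minimal NumPy sketch with toy stand-ins for the learned networks; `mask_fn` and `transform_fn` are hypothetical placeholders, not the paper's actual parametric networks.

```python
import numpy as np

def augment(x, mask_fn, transform_fn):
    """Factorization-based view generation (sketch):
    transform the informative part, keep the rest intact."""
    h = mask_fn(x)                         # soft mask in [0, 1], same shape as x
    return h * transform_fn(x) + (1 - h) * x

# Toy stand-ins for the learned factorization and transformation networks.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 4 * np.pi, 64))            # a univariate toy series
mask_fn = lambda s: (np.abs(s) > 0.5).astype(float)  # "informative" = high amplitude
transform_fn = lambda s: s * rng.uniform(0.8, 1.2)   # mild scaling as a benign transform

v = augment(x, mask_fn, transform_fn)                # augmented view for contrastive pairs
```

In training, the view `v` and the original `x` would be fed through the (arbitrary) backbone encoder as a positive pair, which is what makes the framework encoder-agnostic.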
Experiments on univariate forecasting tasks show that AutoTCL achieves a 6.5% reduction in MSE and a 4.7% reduction in MAE compared to leading baselines. In classification tasks, it achieves a 1.2% increase in average accuracy, and in multivariate forecasting it delivers a 2.9% reduction in MSE and a 1.2% reduction in MAE. The framework is evaluated on benchmark time series forecasting datasets and performs strongly across tasks. Ablation studies show that the factorization and transformation components are crucial to this performance, and comparisons with other contrastive learning frameworks demonstrate the method's versatility and adaptability. Overall, the results indicate that AutoTCL is a promising approach for time series contrastive learning, with the potential to be applied in various domains.