Universal Time-Series Representation Learning: A Survey

August 2024 | PATARA TRIRAT, YOOJU SHIN, JUNHYEOK KANG, YOUNGEUN NAM, JIHYE NA, MIN-YOUNG BAE, JOEUN KIM, BYUNGHYUN KIM, and JAE-GIL LEE
This survey presents a comprehensive overview of universal time-series representation learning, organized around three key design elements: training data, network architectures, and learning objectives. It reviews existing studies, distills their insights into enhancing representation quality, and summarizes common experimental setups and datasets. The survey highlights the role of deep learning in extracting hidden patterns from time-series data without manual feature engineering.

On the architecture side, it discusses recurrent neural networks (RNNs), including long short-term memory (LSTM) and gated recurrent unit (GRU) variants, convolutional neural networks (CNNs), temporal convolutional networks (TCNs), and graph neural networks (GNNs), as well as attention-based models and neural differential equations.

On the data side, it covers data-centric approaches such as improving data quality through sample selection, time-series decomposition, and input-space transformation, and increasing data quantity through augmentation techniques, both random and policy-based. It also examines self-supervised learning, contrastive learning, and the challenges of handling irregularly sampled time series. The survey concludes with a discussion of promising research directions and guidelines for future studies in time-series representation learning.
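To make the decomposition idea concrete, here is a minimal sketch, assuming a simple additive model and the statsmodels library; the synthetic series, period, and parameters are illustrative, not taken from the survey.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical monthly series with trend + seasonality + noise (illustrative only).
idx = pd.date_range("2020-01-01", periods=48, freq="MS")
values = np.linspace(0, 10, 48) + 3 * np.sin(np.arange(48) * 2 * np.pi / 12)
series = pd.Series(values + np.random.normal(scale=0.5, size=48), index=idx)

# Additive decomposition into trend, seasonal, and residual components;
# any of the three can then be fed to (or removed from) an encoder's input.
result = seasonal_decompose(series, model="additive", period=12)
trend, seasonal, resid = result.trend, result.seasonal, result.resid
```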
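Similarly, the random augmentation methods mentioned above, such as jittering and scaling, can be sketched in a few lines; the function names and noise levels below are assumptions for illustration, not definitions from the survey.

```python
import numpy as np

def jitter(x, sigma=0.03):
    """Add Gaussian noise at each time step (a common random augmentation)."""
    return x + np.random.normal(loc=0.0, scale=sigma, size=x.shape)

def scale(x, sigma=0.1):
    """Multiply each channel by a random factor drawn around 1.0."""
    factors = np.random.normal(loc=1.0, scale=sigma, size=(1, x.shape[1]))
    return x * factors

# Example: a univariate series of length 128 with shape (time, channels).
series = np.sin(np.linspace(0, 8 * np.pi, 128)).reshape(128, 1)
augmented = scale(jitter(series))
```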
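Finally, a common instantiation of the contrastive objective in this literature is the InfoNCE loss, where two augmented views of the same series form a positive pair and the other samples in the batch serve as negatives. The sketch below is a generic NumPy version under that assumption, not the exact formulation of any surveyed method.

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss over a batch: z1[i] and z2[i] are embeddings of two
    augmented views of the same series; other rows act as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature          # (N, N) cosine-similarity matrix
    # Softmax cross-entropy with positives on the diagonal.
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    diag = np.arange(len(z1))
    return -log_prob[diag, diag].mean()
```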