Universal Time-Series Representation Learning: A Survey


August 2024 | PATARA TRIRAT, YOOJU SHIN, JUNHYEOK KANG, YOUNGEUN NAM, JIHYE NA, MINYOUNG BAE, JOEUN KIM, BYUNGHYUN KIM, and JAE-GIL LEE*
This survey provides a comprehensive overview of universal time-series representation learning, focusing on methods whose learned representations are effective across multiple downstream tasks. The authors propose a novel taxonomy built on three fundamental elements of representation learning: data-centric approaches, neural architectural choices, and learning objectives. They review existing studies, discussing the intuitions and insights behind each design decision for enhancing the quality of learned representations. The survey also offers guidelines for experimental setups and benchmark datasets, and identifies open research challenges. Key contributions include an extensive literature review, the taxonomy itself, and a discussion of future research directions.

The paper highlights the importance of addressing the unique properties of time-series data, such as temporal dependencies, high noise, inter-variable relationships, variability, and nonstationarity. It examines a range of neural architectures, including RNNs (with LSTM and GRU variants), CNNs, GNNs, attention-based networks, and neural differential equations, each with its own strengths and limitations. It also covers data-centric approaches such as sample selection, time-series decomposition, input space transformation, and data augmentation.
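As a concrete illustration of the data-centric element, here is a minimal sketch, assuming only NumPy, of three augmentations commonly used in this literature (jittering, scaling, and segment permutation) to produce alternative views of a series. The function names and default parameters are illustrative, not taken from the survey or any specific method.

```python
# Illustrative time-series augmentations for representation learning.
# Input convention (an assumption): x has shape (time_steps, variables).
import numpy as np

def jitter(x: np.ndarray, sigma: float = 0.03) -> np.ndarray:
    """Add Gaussian noise at every time step (simulates sensor noise)."""
    return x + np.random.normal(loc=0.0, scale=sigma, size=x.shape)

def scale(x: np.ndarray, sigma: float = 0.1) -> np.ndarray:
    """Multiply each variable by a random factor (simulates amplitude drift)."""
    factors = np.random.normal(loc=1.0, scale=sigma, size=(1, x.shape[1]))
    return x * factors

def permute_segments(x: np.ndarray, n_segments: int = 4) -> np.ndarray:
    """Split the series along the time axis and shuffle the segments."""
    segments = np.array_split(x, n_segments, axis=0)
    order = np.random.permutation(len(segments))
    return np.concatenate([segments[i] for i in order], axis=0)

# Example: build two augmented "views" of one univariate series,
# as a contrastive pretext task would.
series = np.sin(np.linspace(0, 8 * np.pi, 200))[:, None]
view_a = jitter(scale(series))
view_b = permute_segments(jitter(series))
```

A typical design choice here is to keep augmentations weak enough that the semantics of the series (e.g., its class or regime) are preserved, since the two views are treated as a positive pair downstream.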
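On the learning-objective side, many of the surveyed self-supervised methods optimize a contrastive loss over such views. The following is a generic sketch of an InfoNCE-style objective, not any specific method's formulation from the survey: embeddings of two views of the same series are pulled together, while other series in the batch serve as negatives.

```python
# Generic InfoNCE-style contrastive loss over paired view embeddings.
import numpy as np

def info_nce(z_a: np.ndarray, z_b: np.ndarray, temperature: float = 0.1) -> float:
    """z_a, z_b: (batch, dim) L2-normalized embeddings of two views.

    Row i of z_a is a positive pair with row i of z_b; all other rows
    of z_b act as in-batch negatives.
    """
    logits = (z_a @ z_b.T) / temperature              # cosine-similarity logits
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))        # positives sit on the diagonal

# Usage with synthetic embeddings (view b is a perturbed copy of view a).
rng = np.random.default_rng(0)
z_a = rng.normal(size=(8, 16))
z_a /= np.linalg.norm(z_a, axis=1, keepdims=True)
z_b = z_a + 0.05 * rng.normal(size=(8, 16))
z_b /= np.linalg.norm(z_b, axis=1, keepdims=True)
print(info_nce(z_a, z_b))  # small loss, since positives dominate
```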