14 Dec 2016 | Zhiguang Wang, Weizhong Yan, Tim Oates
The paper proposes a simple yet strong baseline for time series classification with deep neural networks, focusing on end-to-end models that require no heavy preprocessing or feature engineering. The authors evaluate three architectures, Multilayer Perceptrons (MLP), Fully Convolutional Networks (FCN), and Residual Networks (ResNet), on 44 benchmark datasets from the UCR time series repository. Both FCN and ResNet achieve competitive results, with the FCN performing on par with or better than other state-of-the-art approaches. The global average pooling layer in the FCN also enables Class Activation Maps (CAM), which highlight the regions of the raw series that contribute to a specific label. The paper additionally introduces the Mean Per-Class Error (MPCE) metric, which normalizes each dataset's error rate by its number of classes and is therefore more robust for comparing performance across datasets than raw error rates alone. The results show that the proposed models generalize well and provide a strong baseline for real-world applications and future research.
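To make the summary concrete, here is a minimal sketch of the FCN baseline and the MPCE metric as described above. It assumes a univariate series of length `seq_len` with `n_classes` categories, and uses conv blocks of 128/256/128 filters with kernel sizes 8/5/3 as in the paper's description; the framework choice (tf.keras), the function names, and the example numbers are illustrative assumptions, not the authors' code.

```python
# Sketch of the FCN baseline and MPCE metric, under the assumptions stated above.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models


def build_fcn(seq_len: int, n_classes: int) -> tf.keras.Model:
    """Fully Convolutional Network: three conv blocks, then global average pooling."""
    inputs = layers.Input(shape=(seq_len, 1))
    x = inputs
    for filters, kernel in [(128, 8), (256, 5), (128, 3)]:
        x = layers.Conv1D(filters, kernel, padding="same")(x)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)
    # Global average pooling replaces a dense head; the weights connecting it
    # to the softmax layer are what make Class Activation Maps possible.
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


def mean_per_class_error(error_rates, class_counts):
    """MPCE: average over datasets of the per-dataset error rate divided by
    that dataset's number of classes, so datasets with many classes do not
    dominate the comparison."""
    pce = [e / c for e, c in zip(error_rates, class_counts)]
    return float(np.mean(pce))


# Hypothetical usage: one dataset for the model, three datasets for MPCE.
model = build_fcn(seq_len=152, n_classes=3)
model.summary()
print(mean_per_class_error([0.05, 0.12, 0.30], [3, 2, 10]))
```

The CAM itself would be obtained by taking the feature maps of the last conv block and weighting them by the softmax weights of the predicted class, which localizes the discriminative regions of the input series.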