5 Dec 2020 | Hassan Ismail Fawaz · Benjamin Lucas · Germain Forestier · Charlotte Pelletier · Daniel F. Schmidt · Jonathan Weber · Geoffrey I. Webb · Lhassane Idoumghar · Pierre-Alain Muller · François Petitjean
InceptionTime: Finding AlexNet for Time Series Classification
This paper introduces InceptionTime, a deep learning model for time series classification (TSC) that achieves state-of-the-art accuracy with high scalability. TSC is the task of assigning a category to a time series; recent methods such as HIVE-COTE, though accurate, are too slow for large datasets. InceptionTime is an ensemble of five deep convolutional neural networks (CNNs) inspired by the Inception-v4 architecture. It learns from 1,500 time series in one hour and from 8 million in 13 hours, far outpacing HIVE-COTE while matching its accuracy. Its Inception modules apply filters of multiple lengths in parallel, extracting features from both short and long input series. The model's success is attributed to architectural choices, including bottleneck layers and residual connections, that reduce complexity and improve performance. On the UCR archive, InceptionTime achieves accuracy competitive with HIVE-COTE at a fraction of the training time. An analysis of architectural hyperparameters shows that longer filters and deeper networks improve accuracy but can lead to overfitting. Experiments on synthetic and real datasets confirm InceptionTime's effectiveness, making it a promising solution for large-scale TSC tasks.
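To make the described module concrete, the following is a minimal NumPy sketch of one Inception-style module as the abstract describes it: a 1x1 bottleneck convolution, parallel convolutions with several filter lengths, and a max-pooling branch, concatenated channel-wise. The filter count (32), kernel sizes (10, 20, 40), random weights, and the omission of batch normalization are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def conv1d_same(x, w):
    """'Same'-padded 1-D convolution. x: (T, C_in), w: (k, C_in, C_out)."""
    k, c_in, c_out = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, k - 1 - pad), (0, 0)))
    T = x.shape[0]
    out = np.empty((T, c_out))
    for t in range(T):
        # Dot each length-k window with the filter bank.
        out[t] = np.tensordot(xp[t:t + k], w, axes=([0, 1], [0, 1]))
    return out

def inception_module(x, rng, n_filters=32, kernel_sizes=(10, 20, 40)):
    """One Inception-style module (illustrative weights drawn at random)."""
    T, c_in = x.shape
    # Bottleneck: 1x1 conv reduces channel count before the wide convolutions,
    # cutting the number of parameters in the parallel branches.
    bottleneck = conv1d_same(x, rng.standard_normal((1, c_in, n_filters)) * 0.1)
    # Parallel convolutions with different filter lengths capture patterns
    # at short and long time scales simultaneously.
    branches = [conv1d_same(bottleneck,
                            rng.standard_normal((k, n_filters, n_filters)) * 0.1)
                for k in kernel_sizes]
    # Max-pooling branch (window 3, stride 1) followed by a 1x1 conv.
    xp = np.pad(x, ((1, 1), (0, 0)), constant_values=-np.inf)
    pooled = np.stack([xp[i:i + T] for i in range(3)]).max(axis=0)
    branches.append(conv1d_same(pooled, rng.standard_normal((1, c_in, n_filters)) * 0.1))
    # Channel-wise concatenation, then ReLU (batch norm omitted for brevity).
    z = np.concatenate(branches, axis=1)
    return np.maximum(z, 0)

rng = np.random.default_rng(0)
series = rng.standard_normal((128, 1))        # one univariate series, length 128
features = inception_module(series, rng)
print(features.shape)                          # (128, 128): 4 branches x 32 filters
```

In the full network, several such modules are stacked with residual connections, and the five-member ensemble averages the softmax outputs of independently initialized networks to reduce the variance of a single training run.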