This paper proposes integrating Long Short-Term Memory (LSTM) recurrent neural networks into fully convolutional networks (FCNs) to enhance time series classification performance. The authors introduce two models, LSTM-FCN and ALSTM-FCN, which augment FCNs with LSTM sub-modules. Both achieve state-of-the-art performance on the University of California, Riverside (UCR) benchmark datasets, outperforming existing methods while requiring minimal preprocessing and only a nominal increase in model size. An attention mechanism is also explored to visualize the decision process of the LSTM cells, and fine-tuning is proposed as a way to further improve performance. The paper provides a detailed analysis of the proposed models, their architecture, and experimental results, demonstrating their effectiveness over competing techniques.
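To make the two-branch design concrete, below is a minimal PyTorch sketch of an LSTM-FCN-style classifier: an FCN branch (stacked 1-D convolutions with batch normalization and global average pooling) in parallel with an LSTM branch, concatenated before the final classification layer. The specific hyperparameters (filter counts, kernel sizes, LSTM units, dropout rate) are illustrative assumptions, not guaranteed to match the paper's exact configuration.

```python
import torch
import torch.nn as nn

class LSTMFCN(nn.Module):
    """Sketch of an LSTM-FCN-style model for univariate time series.

    Hyperparameters here are assumptions for illustration; consult the
    paper for the authors' actual configuration.
    """

    def __init__(self, seq_len: int, num_classes: int, lstm_units: int = 8):
        super().__init__()
        # FCN branch: three Conv1D blocks, each with batch norm and ReLU,
        # followed by global average pooling over the time dimension.
        self.fcn = nn.Sequential(
            nn.Conv1d(1, 128, kernel_size=8, padding="same"),
            nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 256, kernel_size=5, padding="same"),
            nn.BatchNorm1d(256), nn.ReLU(),
            nn.Conv1d(256, 128, kernel_size=3, padding="same"),
            nn.BatchNorm1d(128), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling
        )
        # LSTM branch: here the univariate series is presented as a single
        # time step with seq_len features (a "dimension shuffle" view).
        self.lstm = nn.LSTM(input_size=seq_len, hidden_size=lstm_units,
                            batch_first=True)
        self.dropout = nn.Dropout(0.8)
        # Concatenated features from both branches feed the classifier.
        self.fc = nn.Linear(128 + lstm_units, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, seq_len)
        fcn_out = self.fcn(x).squeeze(-1)            # (batch, 128)
        lstm_out, _ = self.lstm(x)                   # (batch, 1, lstm_units)
        lstm_out = self.dropout(lstm_out[:, -1, :])  # (batch, lstm_units)
        return self.fc(torch.cat([fcn_out, lstm_out], dim=1))
```

The ALSTM-FCN variant would replace the plain LSTM cell with an attention-augmented one, which is what enables the visualization of the cell's decision process mentioned above.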