30 Nov 2015 | Chunting Zhou, Chonglin Sun, Zhiyuan Liu, Francis C.M. Lau
The paper introduces a novel neural network model called C-LSTM, which combines the strengths of Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) to improve sentence representation and text classification tasks. C-LSTM first uses a CNN to extract higher-level phrase representations from a sentence, which are then fed into an LSTM to capture both local and global sentence semantics. The model is evaluated on sentiment classification and question classification tasks, showing superior performance compared to both CNN and LSTM models individually. The experimental results demonstrate that C-LSTM can effectively learn long-term dependencies and capture complex sentence structures, making it a promising approach for natural language processing tasks.
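The CNN-then-LSTM pipeline described above can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: all dimensions (sequence length, embedding size, filter width, hidden size) are hypothetical, the weights are random stand-ins for trained parameters, and details such as multiple filter widths and the final softmax classifier are omitted.

```python
import numpy as np

np.random.seed(0)

# Hypothetical dimensions (illustrative only, not from the paper)
seq_len, embed_dim = 10, 8      # sentence of 10 tokens, 8-dim word vectors
filter_size, n_filters = 3, 16  # CNN window of 3 words, 16 feature maps
hidden_dim = 12                 # LSTM hidden-state size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv1d(x, W, b):
    """Slide each filter over windows of consecutive word vectors,
    producing one higher-level phrase vector per window."""
    n_windows = x.shape[0] - W.shape[1] + 1
    out = np.empty((n_windows, W.shape[0]))
    for t in range(n_windows):
        window = x[t:t + W.shape[1]].ravel()          # concatenate word vectors
        out[t] = np.tanh(W.reshape(W.shape[0], -1) @ window + b)
    return out

def lstm(xs, Wx, Wh, b):
    """Run a plain LSTM over the sequence of phrase vectors and return
    the final hidden state as the sentence representation."""
    h = np.zeros(hidden_dim)
    c = np.zeros(hidden_dim)
    for x in xs:
        z = Wx @ x + Wh @ h + b                       # all four gates at once
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)                    # update cell state
        h = o * np.tanh(c)                            # emit hidden state
    return h

# Random parameters stand in for trained weights.
Wc = np.random.randn(n_filters, filter_size, embed_dim) * 0.1
bc = np.zeros(n_filters)
Wx = np.random.randn(4 * hidden_dim, n_filters) * 0.1
Wh = np.random.randn(4 * hidden_dim, hidden_dim) * 0.1
bl = np.zeros(4 * hidden_dim)

sentence = np.random.randn(seq_len, embed_dim)        # stand-in word embeddings
phrases = conv1d(sentence, Wc, bc)                    # (8, 16): phrase features
sentence_vec = lstm(phrases, Wx, Wh, bl)              # (12,): sentence vector
print(phrases.shape, sentence_vec.shape)
```

The key structural point the sketch shows: the convolution shortens the token sequence into a sequence of phrase-level feature vectors, and the LSTM then consumes that shorter sequence, so its recurrence operates over phrases rather than individual words.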