Recurrent Convolutional Neural Networks for Text Classification


2015 | Siwei Lai, Liheng Xu, Kang Liu, Jun Zhao
This paper proposes a Recurrent Convolutional Neural Network (RCNN) for text classification that outperforms traditional methods and state-of-the-art approaches on several datasets. The RCNN combines a bidirectional recurrent structure with a max-pooling layer to capture contextual information and the key components of a text. The recurrent structure captures contextual information with less noise than traditional window-based neural networks, while the max-pooling layer automatically identifies the words most important for classification. The model's time complexity is O(n), linear in the text length.

The RCNN is evaluated on four datasets: 20Newsgroups, Fudan Set, ACL Anthology Network, and Stanford Sentiment Treebank, using accuracy and Macro-F1 as metrics. The results show that the RCNN outperforms existing methods, particularly on document-level datasets, and achieves competitive results on both sentiment classification and topic classification. Because it requires no handcrafted features, the model is also well suited to low-resource languages. Compared with CNNs and RecursiveNNs, the RCNN captures contextual information more effectively: its recurrent structure preserves longer-range context while introducing less noise. Overall, the RCNN is a promising approach for text classification thanks to its ability to capture contextual information and its efficiency on long texts.
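The architecture described above can be sketched as a forward pass: each word's representation concatenates a left context (scanned forward), the word embedding, and a right context (scanned backward); a hidden layer maps each position to a latent semantic vector, and element-wise max-pooling over positions yields a fixed-length text vector fed to a softmax. The following is a minimal illustrative sketch, not the authors' implementation; all dimensions, weight names, and the random initialization are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): embedding dim, context dim,
# hidden dim, number of classes, and sequence length.
e_dim, c_dim, h_dim, n_classes, n = 4, 3, 5, 2, 6

# Random embeddings standing in for a learned word-embedding lookup.
E = rng.standard_normal((n, e_dim))

# Recurrent-structure parameters (shapes assumed for illustration).
W_l  = rng.standard_normal((c_dim, c_dim))   # left-context recurrence
W_sl = rng.standard_normal((c_dim, e_dim))   # previous embedding -> left context
W_r  = rng.standard_normal((c_dim, c_dim))   # right-context recurrence
W_sr = rng.standard_normal((c_dim, e_dim))   # next embedding -> right context

# Hidden (latent semantic) layer and output layer.
W2 = rng.standard_normal((h_dim, c_dim + e_dim + c_dim))
b2 = np.zeros(h_dim)
W4 = rng.standard_normal((n_classes, h_dim))
b4 = np.zeros(n_classes)

def rcnn_forward(E):
    n = len(E)
    cl = np.zeros((n, c_dim))  # left contexts, computed in a forward scan
    cr = np.zeros((n, c_dim))  # right contexts, computed in a backward scan
    for i in range(1, n):
        cl[i] = np.tanh(W_l @ cl[i - 1] + W_sl @ E[i - 1])
    for i in range(n - 2, -1, -1):
        cr[i] = np.tanh(W_r @ cr[i + 1] + W_sr @ E[i + 1])
    # x_i = [c_l(w_i); e(w_i); c_r(w_i)], then latent vector y_i per position.
    x = np.concatenate([cl, E, cr], axis=1)
    y2 = np.tanh(x @ W2.T + b2)
    # Element-wise max-pooling over positions keeps the strongest
    # activation of each feature anywhere in the text.
    y3 = y2.max(axis=0)
    logits = W4 @ y3 + b4
    exp = np.exp(logits - logits.max())   # stable softmax
    return exp / exp.sum()

probs = rcnn_forward(E)   # class distribution for the whole text
```

Both scans and the pooling visit each position once, which is where the O(n) time complexity noted above comes from.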