Better Word Representations with Recursive Neural Networks for Morphology

August 8-9 2013 | Minh-Thang Luong, Richard Socher, Christopher D. Manning
This paper addresses a limitation of existing word representations: they often fail to capture the relationships among morphologically related words, leading to poor estimates for rare and complex words. The authors propose a model that combines recursive neural networks (RNNs) and neural language models (NLMs) to build representations for morphologically complex words from their morphemes. By integrating contextual information, the model learns morphemic semantics and their compositional properties, enabling it to handle unseen words and improve performance on word similarity tasks across several datasets. The authors also introduce a new dataset focusing on rare words to complement existing benchmarks. Experimental results show that the proposed model outperforms existing word representations, demonstrating its effectiveness in capturing both syntactic and semantic information.
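The core compositional idea can be illustrated with a short sketch: a morphologically complex word's vector is built recursively from morpheme vectors, combining a stem with one affix at a time through a learned composition function. The names (`W_m`, `b_m`), the dimensionality, the morpheme segmentation, and the composition order below are illustrative assumptions, not parameters taken from the paper; in the actual model all of these quantities are learned.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimensionality (the paper uses much larger vectors)

# Toy morpheme embeddings; in the model these are trained parameters.
morphemes = {
    "un": rng.normal(size=d),
    "fortunate": rng.normal(size=d),
    "ly": rng.normal(size=d),
}

W_m = rng.normal(scale=0.1, size=(d, 2 * d))  # composition matrix (hypothetical init)
b_m = np.zeros(d)                             # composition bias

def compose(stem_vec, affix_vec):
    """Parent vector p = tanh(W_m [stem; affix] + b_m) for one composition step."""
    return np.tanh(W_m @ np.concatenate([stem_vec, affix_vec]) + b_m)

# "unfortunately": attach affixes to the stem one at a time, reusing each
# intermediate parent vector as the new stem (order shown is illustrative).
p1 = compose(morphemes["fortunate"], morphemes["un"])
word_vec = compose(p1, morphemes["ly"])
print(word_vec.shape)  # (4,)
```

Because the same composition function is reused at every node, the model can assemble a vector for any word whose morphemes it has seen, including words absent from the training vocabulary.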