ERNIE: Enhanced Language Representation with Informative Entities


4 Jun 2019 | Zhengyan Zhang, Xu Han, Zhiyuan Liu, Xin Jiang, Maosong Sun, Qun Liu
The paper introduces ERNIE (Enhanced Language Representation with Informative Entities), a model that integrates large-scale textual corpora with knowledge graphs to enhance language representation. ERNIE leverages external knowledge to improve language understanding, particularly on knowledge-driven tasks such as entity typing and relation classification. The model is pre-trained on both text and knowledge graphs using the masked language model and next sentence prediction objectives, together with a new pre-training task, the denoising entity auto-encoder (dEA), which injects knowledge into the language representation. Experimental results show that ERNIE significantly outperforms BERT on knowledge-driven tasks while maintaining comparable performance on common NLP tasks. The paper also describes ERNIE's architecture, its pre-training and fine-tuning procedures, and detailed evaluations on various datasets.
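
To make the dEA idea concrete: for each token that is aligned with a knowledge-graph entity, the alignment is sometimes masked, and the model must predict the correct entity from a candidate list based on the token's hidden state. The following is a minimal, illustrative sketch of such an objective in PyTorch, not the authors' implementation; the module and argument names (DenoisingEntityAutoEncoder, hidden_dim, entity_dim) are assumptions made here for illustration.

```python
# Illustrative sketch of a dEA-style objective, not the official ERNIE code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenoisingEntityAutoEncoder(nn.Module):
    """Predict the aligned KG entity for each (masked) token-entity alignment."""
    def __init__(self, hidden_dim: int, entity_dim: int):
        super().__init__()
        # Project token hidden states into the entity-embedding space.
        self.proj = nn.Linear(hidden_dim, entity_dim)

    def forward(self, token_states, entity_embeddings, entity_targets):
        # token_states:      (num_aligned_tokens, hidden_dim) encoder outputs at aligned positions
        # entity_embeddings: (num_candidates, entity_dim) pretrained entity embeddings (e.g. TransE)
        # entity_targets:    (num_aligned_tokens,) index of the gold entity for each token
        logits = self.proj(token_states) @ entity_embeddings.t()  # dot-product scores over candidates
        return F.cross_entropy(logits, entity_targets)

# Toy usage with random tensors, just to show the shapes involved.
dea = DenoisingEntityAutoEncoder(hidden_dim=768, entity_dim=100)
tokens = torch.randn(4, 768)           # hidden states of 4 aligned tokens
entities = torch.randn(50, 100)        # 50 candidate entity embeddings
targets = torch.randint(0, 50, (4,))   # gold entity index per token
loss = dea(tokens, entities, targets)
```

In this sketch the entity vocabulary is treated as a fixed candidate list and the prediction reduces to a softmax over dot products between projected token states and entity embeddings, which mirrors the denoising-style entity prediction described in the summary above.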