DEEP BIAFFINE ATTENTION FOR NEURAL DEPENDENCY PARSING

10 Mar 2017 | Timothy Dozat, Christopher D. Manning
This paper presents a modified version of the neural dependency parser proposed by Kiperwasser & Goldberg (2016), using deep biaffine attention to improve parsing performance. The authors build a larger but more regularized parser with biaffine classifiers for predicting arcs and labels, achieving state-of-the-art or near-state-of-the-art performance on standard treebanks for six different languages. The parser outperforms other graph-based approaches and is comparable to the best transition-based parser. The paper also examines hyperparameter choices that significantly affect parsing accuracy, demonstrating large gains over other graph-based methods. The proposed model retains the simplicity of neural graph-based approaches while approaching the performance of the state-of-the-art transition-based parser. The authors further explore the impact of different architectures and hyperparameters, providing empirical evidence to support their approach.
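To make the core idea concrete, here is a minimal NumPy sketch of a biaffine arc scorer in the spirit of the paper: each token gets a "dependent" and a "head" representation, and the score of token j heading token i combines a bilinear term with a linear bias on the head. All variable names and dimensions below are illustrative assumptions, not the authors' actual implementation (which uses MLP-reduced LSTM states and learned parameters).

```python
import numpy as np

def biaffine_arc_scores(H_dep, H_head, U, u):
    """Score every (dependent, head) pair with a biaffine function.

    H_dep:  (n, d) dependent representations, one row per token
    H_head: (n, d) head representations, one row per token
    U:      (d, d) bilinear weight matrix
    u:      (d,)   linear weight on the head representation (head prior)

    Returns an (n, n) matrix where scores[i, j] is the score of
    token j being the head of token i:
        scores[i, j] = H_dep[i] @ U @ H_head[j] + u @ H_head[j]
    """
    bilinear = H_dep @ U @ H_head.T      # (n, n) pairwise bilinear term
    head_bias = H_head @ u               # (n,) per-head linear term
    return bilinear + head_bias          # broadcast over the head axis

# Toy example with random representations and weights (assumed shapes only).
rng = np.random.default_rng(0)
n, d = 5, 8
H_dep = rng.normal(size=(n, d))
H_head = rng.normal(size=(n, d))
U = rng.normal(size=(d, d))
u = rng.normal(size=d)

scores = biaffine_arc_scores(H_dep, H_head, U, u)
pred_heads = scores.argmax(axis=1)  # greedy head choice per token
```

In the full parser, greedy argmax decoding would be replaced by a maximum-spanning-tree algorithm to guarantee a well-formed tree, and an analogous biaffine classifier scores dependency labels for each predicted arc.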