A Multi-task Shared Cascade Learning for Aspect Sentiment Triplet Extraction Using BERT-MRC

29 January 2024 | Wang Zou, Wubo Zhang, Wenhuan Wu, Zhuoyan Tian
This paper proposes a multi-task shared cascade learning framework based on machine reading comprehension (MRC) for aspect sentiment triplet extraction (Triplet). The framework, named Triple-MRC, addresses limitations of existing methods, such as uneven contribution distribution among components and error propagation in pipeline-based approaches. The proposed method leverages the prior knowledge carried by each question to reduce error propagation and to limit model complexity. Triple-MRC is evaluated on two publicly available benchmark datasets for the Triplet task, where it outperforms the baseline models. Analyses covering comparison studies, the model training process, error analysis, ablation studies, attention visualization, and case studies confirm the effectiveness of both the multi-task shared cascade learning method and the MRC approach.

The main contributions of this paper are: (1) integrating multi-task learning and MRC into the Triplet task, combining the advantages of multi-task shared-parameter learning and multi-task cascade learning; (2) introducing Triple-MRC, a multi-task shared cascade learning model based on MRC that divides the Triplet task into three subtasks: aspect term extraction (AE), aspect-oriented opinion extraction (AOE), and aspect-level sentiment classification (ASC), executed by three BERT-MRC models with shared parameters; and (3) conducting comparison studies, error analysis, ablation experiments, attention visualization, and case studies to demonstrate the performance of the proposed model. The experimental results indicate that Triple-MRC outperforms current state-of-the-art methods.
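The cascade described above can be illustrated with a minimal sketch: each subtask (AE, AOE, ASC) is posed as a reading-comprehension query, and the answer from one stage is folded into the question for the next, so later stages benefit from earlier answers as prior knowledge. The query templates and the stub reader below are illustrative assumptions, not the paper's exact implementation; in Triple-MRC the reader would be a BERT-MRC model with parameters shared across the three stages.

```python
# Hedged sketch of a three-stage MRC cascade for triplet extraction.
# The shared "reader" stands in for the shared-parameter BERT-MRC model;
# query wordings are hypothetical, not taken from the paper.

def cascade_triplets(sentence, reader):
    """Run the AE -> AOE -> ASC cascade with one shared reader."""
    triplets = []
    # Stage 1 (AE): query for aspect terms in the sentence.
    aspects = reader(sentence, "What are the aspect terms?")
    for aspect in aspects:
        # Stage 2 (AOE): the AE answer becomes prior knowledge in this query.
        opinions = reader(sentence, f"What opinions describe {aspect}?")
        for opinion in opinions:
            # Stage 3 (ASC): classify sentiment for the (aspect, opinion) pair.
            polarity = reader(
                sentence,
                f"What is the sentiment of {aspect} given {opinion}?",
            )
            triplets.append((aspect, opinion, polarity[0]))
    return triplets


def toy_reader(sentence, question):
    """Toy rule-based stand-in for the shared BERT-MRC reader."""
    if "aspect terms" in question:
        return ["battery life"]
    if "opinions describe" in question:
        return ["great"]
    return ["positive"]


print(cascade_triplets("The battery life is great.", toy_reader))
# -> [('battery life', 'great', 'positive')]
```

Because the same `reader` object answers all three queries, swapping the toy stub for a single fine-tuned BERT-MRC model mirrors the shared-parameter design: the three subtasks differ only in the question posed, not in the underlying encoder.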