A Multi-task Shared Cascade Learning for Aspect Sentiment Triplet Extraction Using BERT-MRC


Wang Zou, Wubo Zhang, Wenhuan Wu, Zhuoyan Tian. Accepted: 6 January 2024; published online: 29 January 2024.
The paper introduces Triple-MRC, a framework that combines multi-task shared cascade learning with machine reading comprehension (MRC) to address the aspect sentiment triplet extraction (Triplet) task. The task comprises three subtasks over review sentences: aspect term extraction (AE), aspect-oriented opinion term extraction (AOE), and aspect-level sentiment classification (ASC). End-to-end approaches suffer from the contribution-distribution problem across subtasks, while pipeline approaches suffer from error propagation. Triple-MRC instead performs each subtask with a BERT-MRC model trained under a question-answer (QA) formulation: multi-task shared parameter learning avoids the contribution-distribution issue, and multi-task cascade learning mitigates error propagation. The model also incorporates dependency syntactic features to better capture sentence structure. Experiments on two benchmark datasets show that Triple-MRC outperforms existing methods, and its effectiveness is further validated through comparison studies, error analysis, ablation experiments, attention visualization, and case studies.
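To make the cascaded QA formulation concrete, below is a minimal Python sketch of three chained BERT-MRC passes (AE, then AOE, then ASC). It is an illustration under stated assumptions, not the paper's implementation: the query templates, the publicly available SQuAD-fine-tuned checkpoint, and the greedy span decoding are stand-ins, and the sketch omits Triple-MRC's shared encoder parameters and dependency-syntax features.

```python
# Minimal sketch of a cascaded MRC pipeline for triplet extraction.
# Assumptions: query wording, checkpoint choice, and greedy decoding are
# illustrative; the paper's exact templates and heads may differ.
import torch
from transformers import BertTokenizerFast, BertForQuestionAnswering

CHECKPOINT = "deepset/bert-base-cased-squad2"  # generic extractive-QA BERT
tokenizer = BertTokenizerFast.from_pretrained(CHECKPOINT)
model = BertForQuestionAnswering.from_pretrained(CHECKPOINT)
model.eval()


def answer_span(question: str, context: str) -> str:
    """Run one BERT-MRC pass and decode the argmax answer span (naive)."""
    inputs = tokenizer(question, context, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)
    start = out.start_logits.argmax(dim=-1).item()
    end = out.end_logits.argmax(dim=-1).item()
    span_ids = inputs["input_ids"][0][start : end + 1]
    return tokenizer.decode(span_ids, skip_special_tokens=True)


sentence = "The sushi was fresh but the service was slow."

# Stage 1 (AE): locate an aspect term.
aspect = answer_span("What are the aspect terms in the sentence?", sentence)

# Stage 2 (AOE): the stage-1 answer is folded into the next query, so any
# upstream error is confined to an explicit, inspectable step of the cascade.
opinion = answer_span(f"What are the opinion terms describing {aspect}?", sentence)

# Stage 3 (ASC): sentiment polarity toward the extracted aspect.
polarity = answer_span(f"What is the sentiment polarity toward {aspect}?", sentence)

print((aspect, opinion, polarity))
```

In the actual framework the three subtasks share encoder parameters during training rather than reusing one frozen QA model, which is what distinguishes shared cascade learning from a plain pipeline of independent readers.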