Multi-Task Chinese Grammatical Error Detection via BERT-Gram and Syntactic Dependency GNN Fusion
Abstract
Grammatical error detection has become a research focus in natural language processing, yet traditional methods handle complex sentence structures poorly and generalize weakly. This study proposes a multi-task learning framework for Chinese grammatical error detection that fuses BERT-Gram with a syntactic dependency graph neural network (GNN) to improve detection accuracy and efficiency: BERT-Gram excels at semantic representation, while the GNN excels at modeling structured syntactic dependencies, so combining the two exploits their complementary strengths. The framework jointly optimizes the error detection and error correction tasks to strengthen both semantic and syntactic modeling, and its effectiveness is verified with the F0.5 score, precision, and recall on the NUCLE and Lang-8 benchmark datasets.
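The abstract evaluates with the F0.5 score alongside precision and recall. As a minimal sketch (the function name and example values below are illustrative, not from the paper), F0.5 is the standard F-beta measure with beta = 0.5, which weights precision more heavily than recall:

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """Standard F-beta score; beta=0.5 emphasizes precision over recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Illustrative values: precision = 0.8, recall = 0.5
# F0.5 = 1.25 * 0.8 * 0.5 / (0.25 * 0.8 + 0.5) = 0.5 / 0.7 ≈ 0.714
print(round(f_beta(0.8, 0.5), 3))
```

F0.5 is conventional in grammatical error detection because flagging correct text as erroneous (a precision failure) is usually considered more harmful than missing an error.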
DOI: https://doi.org/10.31449/inf.v49i31.10252
This work is licensed under a Creative Commons Attribution 3.0 License.