Joint Cultural and Grammatical Correction of Japanese Translations via a GAN-BiLSTM Framework with Policy Gradient Optimization

Abstract

Japanese translations often exhibit grammatical deviations and cultural inconsistencies. This study proposes an automatic correction framework that integrates a Generative Adversarial Network with a Bi-LSTM encoder–decoder augmented by embedded cultural vectors to generate correction candidates. A convolutional discriminator jointly evaluates grammaticality and cultural appropriateness under a Wasserstein adversarial objective, while policy gradient optimization refines the generator using discriminator-based rewards. Semantic consistency is preserved through back-translation constraints. Experiments on 60,000 culturally annotated Japanese–Chinese sentence pairs from news, business, and academic domains, split into training, validation, and testing sets, compare the proposed method with RNN, Transformer, and GAN-only baselines. The model achieves 86.2 percent grammatical accuracy, a cultural alignment score of 4.52, a fluency score of 4.10, and a perplexity of 45.6, consistently outperforming baselines across sentence lengths and cultural load levels. The results demonstrate unified optimization of grammatical correction, cultural alignment, and semantic preservation in Japanese translation post-editing.
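The policy-gradient refinement described above can be illustrated with a toy, self-contained sketch. Everything in it is an assumption for illustration only: the paper's generator is a Bi-LSTM with embedded cultural vectors and the reward comes from a Wasserstein CNN discriminator, whereas here a 4-way categorical policy over correction candidates and fixed scalar rewards stand in for both.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy "generator": a categorical distribution over 4 correction candidates.
logits = np.zeros(4)
# Stand-in discriminator rewards (hypothetical grammatical/cultural scores).
reward = np.array([0.1, 0.2, 0.9, 0.3])

lr = 1.0
for _ in range(300):
    p = softmax(logits)
    baseline = p @ reward  # expected reward, used as a variance-reducing baseline
    # Exact policy gradient of E[r] w.r.t. the logits: p * (r - E[r]).
    # (A sampled REINFORCE estimate of this same gradient is the usual
    # practical choice; the exact form keeps the toy deterministic.)
    logits += lr * p * (reward - baseline)

best = int(np.argmax(softmax(logits)))
print(best)  # the policy concentrates on the highest-reward candidate
```

Gradient ascent on the expected reward monotonically shifts probability mass toward the candidate the discriminator scores highest, which is the mechanism the framework uses to steer generation toward grammatical and culturally aligned corrections.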

References

Cao Z. A Comparative Analysis of Japanese Learners' Translation Bias Using Neurosemantic Analysis[J]. Applied Mathematics and Nonlinear Sciences, 2024, 9(1).

https://doi.org/10.2478/amns-2024-0550

Koizumi R, In'nami Y. Modeling complexity, accuracy, and fluency of Japanese learners of English: A structural equation modeling approach[J]. JALT Journal, 2014, 36(1): 25.

https://doi.org/10.37546/jaltjj36.1-2

Povoroznyuk R, Pocheniuk I, Gaidash A, et al. Neuropedagogical guidelines for translation studies: Perceiving the linguistic-cultural markers of the other (foreign) in translation[J]. Revista Romaneasca pentru Educatie Multidimensionala, 2024, 16(4): 185-209.

https://doi.org/10.18662/rrem/16.4/912

Wei Y. Analysis of cross-cultural education in Japanese teaching based on multimedia technology[J]. Computer-Aided Design and Applications, 2023, 20(S12): 37-56.

https://doi.org/10.14733/cadaps.2023.s12.37-56

Bryant C, Yuan Z, Qorib MR, et al. Grammatical error correction: A survey of the state of the art[J]. Computational Linguistics, 2023, 49(3): 643-701.

https://doi.org/10.1162/coli_a_00478

Yang M, Li F. Improving Machine Translation Formality with Large Language Models[J]. Computers, Materials & Continua, 2025, 82(2).

https://doi.org/10.32604/cmc.2024.058248

Chiba Y, Higashinaka R. Analyzing variations of everyday Japanese conversations based on semantic labels of functional expressions[J]. ACM Transactions on Asian and Low-Resource Language Information Processing, 2023, 22(2): 1-26.

https://doi.org/10.1145/3552310

Tarumoto S, Hatagaki K, Miyata R, et al. Evaluating ChatGPT's ability to generate Japanese[J]. Journal of Natural Language Processing, 2024, 31(2): 349-373.

https://doi.org/10.5715/jnlp.31.349

Miyamoto T, Nagai N, Satada Y, et al. Risky Politeness Strategy[J]. Journal of the Artificial Intelligence Society, 2022, 37(3): IDS-G_1-16.

Son J, Kim B. Translation performance from the user's perspective of large language models and neural machine translation systems[J]. Information, 2023, 14(10): 574.

https://doi.org/10.3390/info14100574

Boluwatife OS. Cultural Nuances in Translation: AI vs Human Translators[J]. 2025.

Zhang Y, Kamigaito H, Okumura M. Bidirectional transformer reranker for grammatical error correction[J]. Journal of Natural Language Processing, 2024, 31(1): 3-46.

https://doi.org/10.5715/jnlp.31.3

Zan C, Ding L, Shen L, et al. Building accurate translation-tailored large language models with language-aware instruction tuning[J]. Frontiers of Information Technology & Electronic Engineering, 2025, 26(8): 1341-1355.

https://doi.org/10.1631/fitee.2400458

Mahsuli MM, Khadivi S, Homayounpour MM. LenM: improving low-resource neural machine translation using target length modeling[J]. Neural Processing Letters, 2023, 55(7): 9435-9466.

https://doi.org/10.1007/s11063-023-11208-1

Lin JCW, García-Díaz V, Morente-Molinera JA. Introduction to the special issue of recent advances in computational linguistics for Asian languages[J]. ACM Transactions on Asian and Low-Resource Language Information Processing, 2023, 22(3): 1-5.

https://doi.org/10.1145/3588316

Katushemererwe F, Caines A, Buttery P. Building natural language processing tools for Runyakitara[J]. Applied Linguistics Review, 2021, 12(4): 585-609.

https://doi.org/10.1515/applirev-2020-2004

Al Sharoufi H, Al-Fadhli WS. Bridging the Gap: Pragmatic and Cultural Challenges in Machine Translation[J]. 2025.

Ishida T, Murakami Y. Impact of Large Language Models on Conversational Translation[J]. Authorea Preprints, 2024.

https://doi.org/10.36227/techrxiv.173014408.89705418/v1

Ding L, Wang L, Liu S. Recurrent graph encoder for syntax-aware neural machine translation[J]. International Journal of Machine Learning and Cybernetics, 2023, 14(4): 1053-1062.

https://doi.org/10.1007/s13042-022-01682-9

Li Z, Parnow K, Zhao H. Incorporating rich syntax information in Grammatical Error Correction[J]. Information Processing & Management, 2022, 59(3): 102891.

https://doi.org/10.1016/j.ipm.2022.102891

Huang X. Research on error detection in English translation texts using machine learning algorithms[J]. Intelligent Decision Technologies, 2024, 18(2): 1403-1409.

https://doi.org/10.3233/idt-240111

Xiang Y, Chen Y, Fan W, et al. Enhancing computer-aided translation system with BiLSTM and convolutional neural network using a knowledge graph approach[J]. The Journal of Supercomputing, 2024, 80(5): 5847-5869.

https://doi.org/10.1007/s11227-023-05686-2

Chen S, Xiao Y. An intelligent error correction model for English grammar with hybrid attention mechanism and RNN algorithm[J]. Journal of Intelligent Systems, 2024, 33(1): 20230170.

https://doi.org/10.1515/jisys-2023-0170

Yirmibeşoğlu Z, Güngör T. Morphologically Motivated Input Variations and Data Augmentation in Turkish-English Neural Machine Translation[J]. ACM Transactions on Asian and Low-Resource Language Information Processing, 2023, 22(3): 1-31.

https://doi.org/10.1145/3571073

Li C, Teney D, Yang L, et al. CulturePark: Boosting cross-cultural understanding in large language models[J]. Advances in Neural Information Processing Systems, 2024, 37: 65183-65216.

Jiao Z, Ren F. WRGAN: Improvement of RelGAN with Wasserstein loss for text generation[J]. Electronics, 2021, 10(3): 275.

https://doi.org/10.3390/electronics10030275

Pang J, Yang B, Wong DF, et al. Rethinking the exploitation of monolingual data for low-resource neural machine translation[J]. Computational Linguistics, 2024, 50(1): 25-47.

https://doi.org/10.1162/coli_a_00496

Hao Y, Liu Y, Mou L. Teacher forcing recovers reward functions for text generation[J]. Advances in Neural Information Processing Systems, 2022, 35: 12594-12607.

Authors

  • Shaonan Lin, Foreign Language Institute, Fujian Polytechnic Normal University, Fuqing, Fujian 350300, China

DOI:

https://doi.org/10.31449/inf.v50i10.12257

Published

03/18/2026

How to Cite

Lin, S. (2026). Joint Cultural and Grammatical Correction of Japanese Translations via a GAN-BiLSTM Framework with Policy Gradient Optimization. Informatica, 50(10). https://doi.org/10.31449/inf.v50i10.12257