Joint Dropout: Improving Generalizability in Low-Resource Neural Machine Translation through Phrase Pair Variables

Open Access
Authors
Publication date 2023
Host editors
  • M. Utiyama
  • R. Wang
Book title MTS: Machine Translation Summit 2023
Book subtitle September 4-8, 2023, Macau SAR, China : Proceedings of Machine Translation Summit XIX. - Vol. 1: Research Track
Event Machine Translation Summit XIX
Pages (from-to) 12-25
Publisher Asia-Pacific Association for Machine Translation
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Despite the tremendous success of Neural Machine Translation (NMT), its performance on low-resource language pairs remains subpar, partly due to a limited ability to handle previously unseen inputs, i.e., limited generalization. In this paper, we propose a method called Joint Dropout that addresses the challenge of low-resource neural machine translation by substituting phrases with variables, resulting in a significant enhancement of compositionality, a key aspect of generalization. We observe a substantial improvement in translation quality for language pairs with minimal resources, as measured by BLEU and Direct Assessment scores. Furthermore, we conduct an error analysis and find that Joint Dropout also enhances the generalizability of low-resource NMT in terms of robustness and adaptability across domains.
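The core substitution step described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes aligned, non-overlapping phrase-pair spans are already available from an external phrase aligner, and the variable-token format (`<X0>`/`<Y0>`) and the sampling scheme are hypothetical choices for illustration.

```python
import random

def joint_dropout(src_tokens, tgt_tokens, phrase_pairs, p=0.5, seed=0):
    """Replace sampled aligned phrase pairs with paired variables X_i / Y_i.

    phrase_pairs: list of ((s_start, s_end), (t_start, t_end)) half-open
    token spans on the source and target sides, assumed non-overlapping
    and produced by an external phrase aligner (not implemented here).
    """
    rng = random.Random(seed)
    # Sample which phrase pairs to drop out; sort left-to-right on the
    # source side so variable indices read naturally in the sentence.
    chosen = sorted(pp for pp in phrase_pairs if rng.random() < p)
    src, tgt = list(src_tokens), list(tgt_tokens)
    # Substitute right-to-left so earlier span offsets remain valid.
    for i, ((ss, se), (ts, te)) in reversed(list(enumerate(chosen))):
        src[ss:se] = [f"<X{i}>"]   # source-side variable
        tgt[ts:te] = [f"<Y{i}>"]   # co-indexed target-side variable
    return src, tgt
```

For example, with the aligned pair "der kleine hund schläft" / "the little dog sleeps" and a phrase pair covering the first three tokens on each side, the function yields `["<X0>", "schläft"]` and `["<Y0>", "sleeps"]`, so the model learns to translate around abstract placeholders rather than only concrete phrases.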
Document type Conference contribution
Language English
Published at https://aclanthology.org/2023.mtsummit-research.2