Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT
| Field | Value |
|---|---|
| Authors | |
| Publication date | 2021 |
| Host editors | |
| Book title | 2021 Conference on Empirical Methods in Natural Language Processing |
| Book subtitle | EMNLP 2021: proceedings of the conference: November 7-11, 2021 |
| ISBN (electronic) | |
| Event | 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021 |
| Pages (from-to) | 8478-8491 |
| Number of pages | 14 |
| Publisher | Stroudsburg, PA: The Association for Computational Linguistics |
| Organisations | |
| Abstract | Unlike traditional statistical MT, which decomposes the translation task into distinct, separately learned components, neural machine translation uses a single neural network to model the entire translation process. Despite neural machine translation being the de facto standard, it is still not clear how NMT models acquire different competences over the course of training, and how this mirrors the different models in traditional SMT. In this work, we look at the competences related to three core SMT components and find that during training, NMT first focuses on learning target-side language modeling, then improves translation quality, approaching word-by-word translation, and finally learns more complex reordering patterns. We show that this behavior holds for several models and language pairs. Additionally, we explain how such an understanding of the training process can be useful in practice and, as an example, show how it can be used to improve vanilla non-autoregressive neural machine translation by guiding teacher model selection. |
| Document type | Conference contribution |
| Note | With supplementary video. |
| Language | English |
| Published at | https://doi.org/10.18653/v1/2021.emnlp-main.667 |
| Other links | https://www.scopus.com/pages/publications/85123479509 |
| Downloads | 2021.emnlp-main.667 (Final published version) |
| Supplementary materials | |
| Permalink to this page | |
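
The abstract describes tracking how an NMT model acquires language-modeling, lexical-translation, and reordering competences over the course of training. As a rough illustration only (not necessarily the paper's exact protocol), the sketch below shows one plausible way to quantify the first two competences: compare a checkpoint's predictive distribution against a target-side language model and against a word-by-word (lexical) translation model via KL divergence. The toy tensors and the helper name `kl_to_reference` are assumptions for illustration.

```python
# Illustrative sketch: measure how close a checkpoint's predictions are to
# two reference models (target-side LM, word-by-word lexical translation).
# Toy random tensors stand in for real model outputs on a held-out set.
import torch
import torch.nn.functional as F


def kl_to_reference(nmt_logits: torch.Tensor, ref_probs: torch.Tensor) -> torch.Tensor:
    """KL(reference || NMT), averaged over target positions.

    A falling value across checkpoints means the model's predictions are
    moving closer to the reference model's behavior.
    """
    log_p = F.log_softmax(nmt_logits, dim=-1)  # NMT predictions as log-probs
    # F.kl_div expects log-probabilities as input and probabilities as target.
    return F.kl_div(log_p, ref_probs, reduction="batchmean")


torch.manual_seed(0)
positions, vocab = 4, 8
nmt_logits = torch.randn(positions, vocab)                    # one checkpoint's predictions
lm_probs = F.softmax(torch.randn(positions, vocab), dim=-1)   # target-side LM distribution
lex_probs = F.softmax(torch.randn(positions, vocab), dim=-1)  # word-by-word translation distribution

print(f"KL to LM:      {kl_to_reference(nmt_logits, lm_probs).item():.4f}")
print(f"KL to lexical: {kl_to_reference(nmt_logits, lex_probs).item():.4f}")
```

Run over a sequence of training checkpoints on the same evaluation positions, the two resulting curves would indicate which reference the model approaches first, in the spirit of the staged behavior the abstract reports.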