Optimizing Transformer for Low-Resource Neural Machine Translation
| Authors | |
|---|---|
| Publication date | 2020 |
| Host editors | |
| Book title | The 28th International Conference on Computational Linguistics |
| Book subtitle | COLING 2020 : Proceedings of the Conference : December 8-13, 2020, Barcelona, Spain (Online) |
| ISBN (electronic) | |
| Event | COLING 2020 |
| Pages (from-to) | 3429-3435 |
| Publisher | International Committee on Computational Linguistics |
| Organisations | |
| Abstract | Language pairs with limited amounts of parallel data, also known as low-resource languages, remain a challenge for neural machine translation. While the Transformer model has achieved significant improvements for many language pairs and has become the de facto mainstream architecture, its capability under low-resource conditions has not been fully investigated yet. Our experiments on different subsets of the IWSLT14 training data show that the effectiveness of Transformer under low-resource conditions is highly dependent on the hyper-parameter settings. We further show that using a Transformer optimized for low-resource conditions improves translation quality by up to 7.3 BLEU points compared to using the default Transformer settings. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.18653/v1/2020.coling-main.304 |
| Published at | https://staff.fnwi.uva.nl/c.monz/html/publications/coling2020.pdf |
| Downloads | 2020.coling-main.304 (Final published version) |
| Permalink to this page | |
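
The abstract notes that Transformer performance under low-resource conditions hinges on hyper-parameter choices. As a rough illustration only, the PyTorch sketch below contrasts a default Transformer-base-style configuration with a smaller, more strongly regularized variant of the kind commonly used for small parallel corpora; all concrete values are illustrative assumptions and are not the settings reported in the paper.

```python
import torch.nn as nn

# Default Transformer-base-style configuration.
base_model = nn.Transformer(
    d_model=512,
    nhead=8,
    num_encoder_layers=6,
    num_decoder_layers=6,
    dim_feedforward=2048,
    dropout=0.1,
)

# Reduced-capacity, more strongly regularized variant for low-resource data.
# Illustrative values only, not taken from the paper.
low_resource_model = nn.Transformer(
    d_model=256,           # smaller model dimension
    nhead=4,               # fewer attention heads
    num_encoder_layers=5,  # shallower encoder
    num_decoder_layers=5,  # shallower decoder
    dim_feedforward=1024,  # narrower feed-forward sublayer
    dropout=0.3,           # stronger dropout to counter overfitting
)

# Compare parameter counts of the two configurations.
print(sum(p.numel() for p in base_model.parameters()))
print(sum(p.numel() for p in low_resource_model.parameters()))
```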