Beyond sentence-level end-to-end speech translation: Context helps
| Authors | |
|---|---|
| Publication date | 2021 |
| Host editors | |
| Book title | The 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing |
| Book subtitle | ACL-IJCNLP 2021 : proceedings of the conference : August 1-6, 2021 |
| ISBN (electronic) | |
| Event | The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021) |
| Volume | |
| Issue number | 1 |
| Pages (from-to) | 2566-2578 |
| Number of pages | 13 |
| Publisher | Stroudsburg, PA: The Association for Computational Linguistics |
| Organisations | |
| Abstract | Document-level contextual information has shown benefits to text-based machine translation, but whether and how context helps end-to-end (E2E) speech translation (ST) is still under-studied. We fill this gap through extensive experiments using a simple concatenation-based context-aware ST model, paired with adaptive feature selection on speech encodings for computational efficiency. We investigate several decoding approaches, and introduce in-model ensemble decoding which jointly performs document- and sentence-level translation using the same model. Our results on the MuST-C benchmark with Transformer demonstrate the effectiveness of context to E2E ST. Compared to sentence-level ST, context-aware ST obtains better translation quality (+0.18-2.61 BLEU), improves pronoun and homophone translation, shows better robustness to (artificial) audio segmentation errors, and reduces latency and flicker to deliver higher quality for simultaneous translation. |
| Document type | Conference contribution |
| Note | With supplementary video |
| Language | English |
| Published at | https://doi.org/10.18653/v1/2021.acl-long.200 |
| Other links | https://github.com/bzhangGo/zero https://www.scopus.com/pages/publications/85118940388 |
| Downloads | 2021.acl-long.200 (Final published version) |
| Supplementary materials | |
| Permalink to this page | |
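The "simple concatenation-based" setup described in the abstract can be sketched minimally as follows. This is a hypothetical text-level illustration, not the authors' code (which operates on speech encodings with adaptive feature selection); the helper name and the `<sep>` boundary token are assumptions.

```python
# Hypothetical sketch of concatenation-based context-aware input
# construction: each sentence is prefixed with up to `context_size`
# preceding sentences from the same document, joined by a separator
# token, so the model can condition on document context.
SEP = "<sep>"  # assumed sentence-boundary marker

def build_context_inputs(sentences, context_size=1):
    """Return one context-augmented input string per sentence."""
    inputs = []
    for i, sent in enumerate(sentences):
        context = sentences[max(0, i - context_size):i]
        inputs.append(f" {SEP} ".join(context + [sent]))
    return inputs

doc = ["She dropped the key.", "It fell under the car."]
print(build_context_inputs(doc, context_size=1))
# the first sentence has no context; the second is prefixed with the first
```

Carrying the previous sentence along is what lets the model disambiguate pronouns and homophones that are unresolvable at the sentence level.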