Meta-learning for fast cross-lingual adaptation in dependency parsing
| Authors | |
|---|---|
| Publication date | 2022 |
| Host editors | |
| Book title | The 60th Annual Meeting of the Association for Computational Linguistics |
| Book subtitle | ACL 2022: proceedings of the conference: May 22-27, 2022 |
| ISBN (electronic) | |
| Event | 60th Annual Meeting of the Association for Computational Linguistics |
| Volume | |
| Issue number | 1 |
| Pages (from-to) | 8503–8520 |
| Publisher | Stroudsburg, PA: Association for Computational Linguistics |
| Organisations | |
| Abstract | Meta-learning, or learning to learn, is a technique that can help to overcome resource scarcity in cross-lingual NLP problems, by enabling fast adaptation to new tasks. We apply model-agnostic meta-learning (MAML) to the task of cross-lingual dependency parsing. We train our model on a diverse set of languages to learn a parameter initialization that can adapt quickly to new languages. We find that meta-learning with pre-training can significantly improve upon the performance of language transfer and standard supervised learning baselines for a variety of unseen, typologically diverse, and low-resource languages, in a few-shot learning setup. |
| Document type | Conference contribution |
| Note | With software and video. |
| Language | English |
| Published at | https://doi.org/10.48550/arXiv.2104.04736 (arXiv); https://doi.org/10.18653/v1/2022.acl-long.582 (ACL Anthology) |
| Other links | https://github.com/annaproxy/udify-metalearning |
| Downloads | 2022.acl-long.582 (final published version) |
| Supplementary materials | |
| Permalink to this page | |
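The abstract describes model-agnostic meta-learning (MAML): training across a diverse set of languages to find a parameter initialization that adapts to a new language in a few gradient steps. As an illustration only — not the paper's UDify-based parser — the sketch below runs first-order MAML on toy 1-D regression tasks, where each task's slope stands in for a "language"; all function names and the toy setup are assumptions for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(theta, X, y):
    """Squared-error loss and its gradient for a linear model y_hat = X @ theta."""
    err = X @ theta - y
    return float(np.mean(err ** 2)), 2.0 * X.T @ err / len(y)

def make_task(w, n=10):
    """A toy 'task': 1-D regression whose slope w stands in for a language."""
    X = rng.normal(size=(n, 1))
    return X, X[:, 0] * w

def adapt(theta, X, y, steps=1, lr=0.1):
    """Inner loop: a few gradient steps on the task's support set."""
    for _ in range(steps):
        _, g = loss_grad(theta, X, y)
        theta = theta - lr * g
    return theta

def maml_train(n_iters=500, inner_lr=0.1, outer_lr=0.01):
    """First-order MAML: learn an initialization that adapts fast to new tasks."""
    theta = np.zeros(1)
    for _ in range(n_iters):
        w = rng.uniform(-2.0, 2.0)          # sample a training "language"
        Xs, ys = make_task(w)               # support set (adaptation data)
        Xq, yq = make_task(w)               # query set (evaluation data)
        adapted = adapt(theta, Xs, ys, steps=1, lr=inner_lr)
        _, gq = loss_grad(adapted, Xq, yq)  # loss gradient after adaptation
        theta = theta - outer_lr * gq       # first-order outer update
    return theta
```

The few-shot setup in the paper corresponds to the final evaluation here: starting from the meta-learned `theta`, a handful of inner-loop steps on an unseen task's support set should already reduce its loss substantially.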
