Learning New Tasks from a Few Examples with Soft-Label Prototypes
| Authors | |
|---|---|
| Publication date | 2024 |
| Host editors | |
| Book title | The 4th Workshop on Multilingual Representation Learning: proceedings of the workshop |
| Book subtitle | MRL 2024: November 16, 2024 |
| ISBN (electronic) | |
| Event | 4th Workshop on Multilingual Representation Learning |
| Pages (from-to) | 215-236 |
| Publisher | Kerrville, TX: Association for Computational Linguistics |
| Organisations | |
| Abstract | Existing approaches to few-shot learning in NLP rely on large language models (LLMs) and/or fine-tuning of these to generalise on out-of-distribution data. In this work, we propose a novel few-shot learning approach based on soft-label prototypes (SLPs) designed to collectively capture the distribution of different classes across the input domain space. We focus on learning previously unseen NLP tasks from very few examples (4, 8, 16) per class and experimentally demonstrate that our approach achieves superior performance on the majority of tested tasks in this data-lean setting while being highly parameter efficient. We also show that our few-shot adaptation method can be integrated into more generalised learning settings, primarily meta-learning, to yield superior performance against strong baselines. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.18653/v1/2024.repl4nlp-1.16 |
| Downloads | 2024.repl4nlp-1.16 (Final published version) |
| Permalink to this page | |
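To make the idea in the abstract concrete: a soft-label prototype pairs a point in feature space with a *distribution* over classes rather than a single hard label, so a few prototypes can jointly describe how several classes are spread across the input space. The sketch below is a minimal, hypothetical illustration of classifying a query by the distance-weighted mixture of its nearest prototypes' soft labels; the function name, the inverse-distance weighting, and the toy data are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def slp_predict(query, prototypes, soft_labels, k=2):
    """Classify `query` from the k nearest soft-label prototypes.

    prototypes  : (n, d) array of prototype locations in feature space
    soft_labels : (n, c) array, one class distribution per prototype
    Returns the predicted class index and the mixed label distribution.
    (Illustrative weighting scheme, not the paper's exact formulation.)
    """
    dists = np.linalg.norm(prototypes - query, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-8)      # inverse-distance weights
    mixture = weights @ soft_labels[nearest] / weights.sum()
    return int(np.argmax(mixture)), mixture

# Toy example: two prototypes in 2-D for three classes. The second
# prototype sits on the boundary between classes 1 and 2, which its
# soft label (0.5 / 0.5) encodes directly.
prototypes = np.array([[0.0, 0.0],
                       [4.0, 0.0]])
soft_labels = np.array([[0.9, 0.1, 0.0],
                        [0.0, 0.5, 0.5]])

label, dist = slp_predict(np.array([0.5, 0.0]), prototypes, soft_labels)
```

A query near the origin inherits mostly the first prototype's distribution, so it is assigned class 0; the mixed distribution stays normalised because it is a convex combination of the prototypes' soft labels.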