Learning to Learn to Disambiguate: Meta-Learning for Few-Shot Word Sense Disambiguation

Open Access
Publication date 2020
Host editors
  • T. Cohn
  • Y. He
  • Y. Liu
Book title Findings of the Association for Computational Linguistics: EMNLP 2020
Book subtitle 16-20 November, 2020
ISBN (electronic)
  • 9781952148903
Event 2020 Conference on Empirical Methods in Natural Language Processing
Pages (from-to) 4517-4533
Publisher Stroudsburg, PA: The Association for Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
The success of deep learning methods hinges on the availability of large training datasets annotated for the task of interest. In contrast to human intelligence, these methods lack versatility and struggle to learn and adapt quickly to new tasks, where labeled data is scarce. Meta-learning aims to solve this problem by training a model on a large number of few-shot tasks, with an objective to learn new tasks quickly from a small number of examples. In this paper, we propose a meta-learning framework for few-shot word sense disambiguation (WSD), where the goal is to learn to disambiguate unseen words from only a few labeled instances. Meta-learning approaches have so far been typically tested in an N-way, K-shot classification setting where each task has N classes with K examples per class. Owing to its nature, WSD deviates from this controlled setup and requires the models to handle a large number of highly unbalanced classes. We extend several popular meta-learning approaches to this scenario, and analyze their strengths and weaknesses in this new challenging setting.
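To make the episode setup concrete, the sketch below shows one way to build a few-shot WSD task for a single ambiguous word: its sense-annotated sentences are split into a small support set (up to K examples per sense) and a query set. The function name, data format, and toy sentences are illustrative assumptions, not the paper's actual implementation; note how, unlike a balanced N-way K-shot benchmark, the number of senses and examples per sense varies from word to word.

```python
import random
from collections import defaultdict

def make_wsd_episode(instances, k_shot, seed=0):
    """Split one word's sense-annotated instances into a support set
    (up to k_shot examples per sense) and a query set of the rest.

    `instances` is a list of (sentence, sense_label) pairs for a single
    ambiguous word; this format is a hypothetical illustration.
    """
    rng = random.Random(seed)
    by_sense = defaultdict(list)
    for sent, sense in instances:
        by_sense[sense].append((sent, sense))
    support, query = [], []
    for sense, examples in by_sense.items():
        rng.shuffle(examples)
        # Unlike a controlled N-way K-shot setting, a sense may have
        # fewer than k_shot examples -- take whatever is available.
        support.extend(examples[:k_shot])
        query.extend(examples[k_shot:])
    return support, query

# Toy episode for the word "bank" with an unbalanced sense distribution.
data = [
    ("deposit money at the bank", "FINANCE"),
    ("the bank raised interest rates", "FINANCE"),
    ("a loan from the bank", "FINANCE"),
    ("fishing on the river bank", "RIVER"),
]
support, query = make_wsd_episode(data, k_shot=1)
```

A meta-learner would then adapt on `support` and be evaluated on `query`, with one such episode per word, so that disambiguating a previously unseen word becomes a new few-shot task.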
Document type Conference contribution
Note Volume comprises papers that were submitted to EMNLP 2020 but not selected to appear at the main conference.
Language English
Published at https://doi.org/10.18653/v1/2020.findings-emnlp.405