Language Modelling as a Multi-Task Problem

Open Access
Authors
Publication date 2021
Host editors
  • P. Merlo
  • J. Tiedemann
  • R. Tsarfaty
Book title The 16th Conference of the European Chapter of the Association for Computational Linguistics
Book subtitle EACL 2021 : proceedings of the conference : April 19-23, 2021
ISBN (electronic)
  • 9781954085022
Event 16th Conference of the European Chapter of the Association for Computational Linguistics
Pages (from-to) 2049–2060
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
In this paper, we propose to study language modelling as a multi-task problem, bringing together three strands of research: multi-task learning, linguistics, and interpretability. Based on hypotheses derived from linguistic theory, we investigate whether language models adhere to learning principles of multi-task learning during training. To showcase the idea, we analyse the generalisation behaviour of language models as they learn the linguistic concept of Negative Polarity Items (NPIs). Our experiments demonstrate that a multi-task setting naturally emerges within the objective of the more general task of language modelling. We argue that this insight is valuable for multi-task learning, linguistics and interpretability research and can lead to exciting new findings in all three domains.
Document type Conference contribution
Language English
Published at https://doi.org/10.18653/v1/2021.eacl-main.176