How do languages influence each other? Studying cross-lingual data sharing during LM fine-tuning

Open Access
Authors
Publication date 2023
Host editors
  • H. Bouamor
  • J. Pino
  • K. Bali
Book title The 2023 Conference on Empirical Methods in Natural Language Processing
Book subtitle EMNLP 2023 : Proceedings of the Conference : December 6-10, 2023
ISBN (electronic)
  • 9798891760608
Event 2023 Conference on Empirical Methods in Natural Language Processing
Pages (from-to) 13244-13257
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
Multilingual language models (MLMs) are jointly trained on data from many different languages, such that the representations of individual languages can benefit from other languages’ data. Impressive performance in zero-shot cross-lingual transfer shows that these models are able to exploit this property. Yet it remains unclear to what extent, and under which conditions, languages rely on each other’s data. To answer this question, we use TracIn (Pruthi et al., 2020), a training data attribution (TDA) method, to retrieve the training samples from multilingual data that are most influential for test predictions in a given language. This allows us to analyse the cross-lingual sharing mechanisms of MLMs from a new perspective. While previous work studied cross-lingual sharing at the model parameter level, we present the first approach to study it at the data level. We find that MLMs rely on data from multiple languages during fine-tuning, and that this reliance increases as fine-tuning progresses. We further find that training samples from other languages can both reinforce and complement the knowledge acquired from data of the test language itself.
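The retrieval step described above can be sketched as follows. This is a minimal illustration of the TracIn scoring rule (summing learning-rate-weighted gradient dot products over checkpoints) with synthetic gradient vectors; the function name and toy values are our own, not the paper's actual implementation:

```python
import numpy as np

def tracin_influence(train_grads, test_grads, lrs):
    """TracIn (Pruthi et al., 2020): the influence of each training sample on a
    test prediction is the sum, over saved checkpoints t, of
    lr_t * <grad_loss(train_sample; theta_t), grad_loss(test_sample; theta_t)>.

    train_grads: per-checkpoint arrays of shape (n_train, d)
    test_grads:  per-checkpoint vectors of shape (d,)
    lrs:         learning rate used at each checkpoint
    """
    scores = np.zeros(train_grads[0].shape[0])
    for g_train, g_test, lr in zip(train_grads, test_grads, lrs):
        scores += lr * (g_train @ g_test)  # one dot product per training sample
    return scores

# Toy illustration: 3 training samples, 2 checkpoints, 2-dimensional "gradients".
train_grads = [np.array([[1., 0.], [0., 1.], [1., 1.]])] * 2
test_grads = [np.array([1., 0.]), np.array([0., 1.])]
scores = tracin_influence(train_grads, test_grads, lrs=[0.1, 0.1])
# Sample 2 aligns with the test gradient at both checkpoints, so it gets the
# highest score and would be retrieved as most influential.
most_influential = int(np.argmax(scores))
```

In the paper's setting the training samples come from many languages, so ranking the fine-tuning set by these scores reveals how often the top-ranked samples are drawn from languages other than the test language.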
Document type Conference contribution
Note With supplementary video.
Language English
Published at https://doi.org/10.18653/v1/2023.emnlp-main.818
Downloads
2023.emnlp-main.818 (Final published version)