The Functional Relevance of Probed Information: A Case Study

Open Access
Publication date 2023
Host editors
  • A. Vlachos
  • I. Augenstein
Book title The 17th Conference of the European Chapter of the Association for Computational Linguistics
Book subtitle EACL 2023: proceedings of the conference: May 2-6, 2023
ISBN (electronic)
  • 9781959429449
Pages (from-to) 835-848
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
Recent studies have shown that transformer models like BERT rely on number information encoded in their representations of sentences’ subjects and head verbs when performing subject-verb agreement. However, probing experiments suggest that subject number is also encoded in the representations of all words in such sentences. In this paper, we use causal interventions to show that BERT only uses the subject plurality information encoded in its representations of the subject and words that agree with it in number. We also demonstrate that current probing metrics are unable to determine which words’ representations contain functionally relevant information. This both provides a revised view of subject-verb agreement in language models, and suggests potential pitfalls for current probe usage and evaluation.
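The probing experiments the abstract refers to typically train a simple linear classifier on a model's hidden representations to test whether a property such as subject number can be decoded from them. The following is a minimal illustrative sketch, not the paper's method: it uses synthetic stand-in vectors (with number information injected along one hypothetical direction) instead of real BERT hidden states, and a plain logistic-regression probe trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for word representations: 64-d vectors in which
# subject number (0 = singular, 1 = plural) is linearly encoded along a
# single direction plus Gaussian noise. A real probe would instead use
# BERT hidden states extracted for each word in the sentence.
d = 64
direction = rng.normal(size=d)
direction /= np.linalg.norm(direction)

def make_reps(n, label):
    noise = rng.normal(scale=0.5, size=(n, d))
    return noise + label * 2.0 * direction

X = np.vstack([make_reps(200, 0), make_reps(200, 1)])
y = np.array([0] * 200 + [1] * 200)

# A linear probe: logistic regression fit with batch gradient descent.
w = np.zeros(d)
b = 0.0
for _ in range(500):
    z = np.clip(X @ w + b, -30, 30)   # clip logits to avoid overflow
    p = 1.0 / (1.0 + np.exp(-z))
    w -= 1.0 * (X.T @ (p - y)) / len(y)
    b -= 1.0 * np.mean(p - y)

acc = np.mean(((X @ w + b) > 0) == y)
print(f"probe accuracy: {acc:.2f}")
```

High decoding accuracy of such a probe shows only that the information is *present* in a representation; as the abstract argues, it does not establish that the model *uses* that information, which is why the paper turns to causal interventions.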
Document type Conference contribution
Note With supplementary video
Language English
Published at https://doi.org/10.18653/v1/2023.eacl-main.58