When Language Models Fall in Love: Animacy Processing in Transformer Language Models

Open Access
Authors
Publication date 2023
Host editors
  • H. Bouamor
  • J. Pino
  • K. Bali
Book title The 2023 Conference on Empirical Methods in Natural Language Processing
Book subtitle EMNLP 2023 : Proceedings of the Conference : December 6-10, 2023
ISBN (electronic)
  • 9798891760608
Event 2023 Conference on Empirical Methods in Natural Language Processing, EMNLP 2023
Pages (from-to) 12120-12135
Number of pages 16
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
Animacy—whether an entity is alive and sentient—is fundamental to cognitive processing, impacting areas such as memory, vision, and language. However, animacy is not always expressed directly in language: in English it often manifests indirectly, in the form of selectional constraints on verbs and adjectives. This poses a potential issue for transformer language models (LMs): they often train only on text, and thus lack access to extralinguistic information from which humans learn about animacy. We ask: how does this impact LMs’ animacy processing—do they still behave as humans do? We answer this question using open-source LMs. As in previous studies, we find that LMs behave much like humans when presented with entities whose animacy is typical. However, we also show that even when presented with stories about atypically animate entities, such as a peanut in love, LMs adapt: they treat these entities as animate, though they do not adapt as well as humans. Even when the context indicating atypical animacy is very short, LMs pick up on subtle clues and change their behavior. We conclude that despite the limited signal through which LMs can learn about animacy, they are indeed sensitive to the relevant lexical semantic nuances available in English.
Document type Conference contribution
Note With supplementary video
Language English
Published at https://doi.org/10.18653/v1/2023.emnlp-main.744
Other links https://github.com/hannamw/lms-in-love