Team DMG at CMCL 2022 Shared Task: Transformer Adapters for the Multi- and Cross-Lingual Prediction of Human Reading Behavior

Open Access
Publication date 2022
Host editors
  • E. Chersoni
  • N. Hollenstein
  • C. Jacobs
  • Y. Oseki
  • L. Prévot
  • E. Santus
Book title Workshop on Cognitive Modeling and Computational Linguistics
Book subtitle CMCL 2022: proceedings of the workshop, May 26, 2022
ISBN (electronic)
  • 9781955917292
Event Workshop on Cognitive Modeling and Computational Linguistics
Pages (from-to) 136-144
Number of pages 9
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
In this paper, we present the details of our approaches, which attained second place in the shared task of the ACL 2022 Cognitive Modeling and Computational Linguistics Workshop. The shared task focuses on the multi- and cross-lingual prediction of eye-movement features in human reading behavior, which can provide valuable information about language processing. To this end, we train `adapters' inserted into the layers of frozen transformer-based pretrained language models. We find that multilingual models equipped with adapters perform well in predicting eye-tracking features. Our results suggest that language- and task-specific adapters are beneficial, and that translating test sets into similar languages present in the training set can improve zero-shot transfer in the prediction of human reading behavior.
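The adapter technique mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical, dependency-free illustration of a bottleneck adapter (down-projection, nonlinearity, up-projection, residual connection) of the kind inserted into each frozen transformer layer, with all names and dimensions chosen here for illustration.

```python
def relu(x):
    """Elementwise ReLU on a list of floats."""
    return [max(0.0, v) for v in x]

def matvec(W, x):
    """Multiply matrix W (list of rows) by vector x."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

def adapter(x, W_down, W_up):
    """Bottleneck adapter: project down, apply nonlinearity,
    project back up, and add a residual connection to the input."""
    h = relu(matvec(W_down, x))          # d -> bottleneck
    out = matvec(W_up, h)                # bottleneck -> d
    return [o + v for o, v in zip(out, x)]  # residual

# With W_up initialised to zeros, the adapter computes the identity,
# so training starts from the frozen model's unchanged behaviour.
hidden = [0.5, -1.0, 2.0, 0.25]                               # d = 4
W_down = [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0]]         # 4 -> 2
W_up = [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]       # 2 -> 4, zero init
print(adapter(hidden, W_down, W_up))  # prints the input unchanged
```

During training only the small adapter matrices are updated while the pretrained transformer weights stay frozen, which is what makes the approach parameter-efficient.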
Document type Conference contribution
Language English
Published at https://doi.org/10.18653/v1/2022.cmcl-1.16
Other links https://aclanthology.org/2022.cmcl-1.16.mp4