Parameter-Efficient Sparse Retrievers and Rerankers Using Adapters

Authors
  • V. Pal
  • C. Lassance
  • H. Déjean
  • S. Clinchant
Publication date 2023
Host editors
  • J. Kamps
  • L. Goeuriot
  • F. Crestani
  • M. Maistro
  • H. Joho
  • B. Davis
  • C. Gurrin
  • U. Kruschwitz
  • A. Caputo
Book title Advances in Information Retrieval
Book subtitle 45th European Conference on Information Retrieval, ECIR 2023, Dublin, Ireland, April 2–6, 2023: Proceedings
ISBN
  • 9783031282379
ISBN (electronic)
  • 9783031282386
Series Lecture Notes in Computer Science
Event 45th European Conference on Information Retrieval
Volume II
Pages (from-to) 16-31
Publisher Cham: Springer
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Parameter-efficient transfer learning with adapters has been studied in Natural Language Processing (NLP) as an alternative to full fine-tuning. Adapters are memory-efficient and scale well with downstream tasks: small bottleneck layers added between transformer layers are trained while the large pretrained language model (PLM) is kept frozen. Despite showing promising results in NLP, these methods are under-explored in Information Retrieval. While previous studies have only experimented with dense retrievers or in a cross-lingual retrieval scenario, in this paper we aim to complete the picture on the use of adapters in IR. First, we study adapters for SPLADE, a sparse retriever, for which adapters not only retain the efficiency and effectiveness otherwise achieved by fine-tuning, but are also memory-efficient and orders of magnitude lighter to train. We observe that Adapter-SPLADE not only optimizes just 2% of the training parameters, but also outperforms its fully fine-tuned counterpart and existing parameter-efficient dense IR models on IR benchmark datasets. Second, we address domain adaptation of neural retrieval with adapters on the cross-domain BEIR datasets and TripClick. Finally, we also consider knowledge sharing between rerankers and first-stage rankers. Overall, our study completes the examination of adapters for neural IR. (The code can be found at: https://github.com/naver/splade/tree/adapter-splade.)
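The bottleneck-adapter idea described in the abstract can be sketched in a few lines. The following is a minimal, dependency-free illustration (not the paper's implementation): a down-projection to a small hidden size r, a nonlinearity, an up-projection back to the model dimension d, and a residual connection. During adapter training only these small matrices are updated; the surrounding transformer weights stay frozen, which is why only a small fraction of parameters (about 2% in the paper's setting) needs optimizing. All function and variable names here are illustrative.

```python
def matvec(W, x):
    """Multiply matrix W (a list of rows) by vector x."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

def relu(v):
    """Elementwise ReLU nonlinearity."""
    return [max(0.0, a) for a in v]

def adapter(x, W_down, W_up):
    """Bottleneck adapter forward pass: x + W_up @ relu(W_down @ x).

    x:      hidden state of dimension d
    W_down: r x d down-projection (trained)
    W_up:   d x r up-projection (trained)
    """
    h = relu(matvec(W_down, x))  # project d -> r, with r << d
    u = matvec(W_up, h)          # project r -> d
    return [xi + ui for xi, ui in zip(x, u)]  # residual connection

# Toy example with d = 3, r = 2.
x = [1.0, -2.0, 3.0]
W_down = [[1.0, 0.0, 0.0],
          [0.0, 1.0, 0.0]]
W_up = [[1.0, 0.0],
        [0.0, 1.0],
        [0.0, 0.0]]
print(adapter(x, W_down, W_up))  # -> [2.0, -2.0, 3.0]
```

A bottleneck of size r adds roughly 2·d·r parameters per layer, small relative to the d·d weight matrices of the frozen transformer. Note that with zero-initialized projections the adapter reduces to the identity, a common initialization choice so training starts from the frozen model's behavior.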
Document type Conference contribution
Language English
Published at https://doi.org/10.1007/978-3-031-28238-6_2
Other links https://github.com/naver/splade/tree/adapter-splade