Adapting Learned Sparse Retrieval for Long Documents

Open Access
Authors
Publication date 2023
Book title SIGIR '23
Book subtitle Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval : July 23-27, 2023, Taipei, Taiwan
ISBN (electronic)
  • 9781450394086
Event 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2023
Pages (from-to) 1781-1785
Publisher New York, NY: Association for Computing Machinery
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Learned sparse retrieval (LSR) is a family of neural retrieval methods that transform queries and documents into sparse weight vectors aligned with a vocabulary. While LSR approaches such as Splade work well for short passages, it is unclear how well they handle longer documents. We investigate existing aggregation approaches for adapting LSR to longer documents and find that proximal scoring is crucial for LSR to handle long documents. To leverage this property, we propose two adaptations of the Sequential Dependence Model (SDM) to LSR: ExactSDM and SoftSDM. ExactSDM assumes only exact query term dependence, while SoftSDM uses potential functions that model the dependence of query terms and their expansion terms (i.e., terms identified using a transformer's masked language modeling head).

Experiments on the MSMARCO Document and TREC Robust04 datasets demonstrate that both ExactSDM and SoftSDM outperform existing LSR aggregation approaches for different document length constraints. Surprisingly, SoftSDM does not provide any performance benefits over ExactSDM. This suggests that soft proximity matching is not necessary for modeling term dependence in LSR. Overall, this study provides insights into handling long documents with LSR, proposing adaptations that improve its performance.
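The classical SDM scoring that ExactSDM and SoftSDM build on can be sketched as follows. This is a minimal illustration of SDM's three potentials (unigram matches, ordered bigrams, and unordered co-occurrence within a window); the count-based features, interpolation weights `lam`, and window size are illustrative assumptions, not the paper's exact formulation:

```python
# Minimal sketch of Sequential Dependence Model (SDM) scoring.
# Feature definitions, weights, and window size are illustrative
# assumptions, not the ExactSDM/SoftSDM formulation from the paper.

def ordered_pairs(doc, a, b):
    """Count occurrences of term a immediately followed by term b."""
    return sum(1 for x, y in zip(doc, doc[1:]) if (x, y) == (a, b))

def unordered_window(doc, a, b, window=8):
    """Count position pairs where a and b co-occur within a window."""
    pos_a = [i for i, t in enumerate(doc) if t == a]
    pos_b = [i for i, t in enumerate(doc) if t == b]
    return sum(1 for i in pos_a for j in pos_b
               if i != j and abs(i - j) < window)

def sdm_score(query, doc, lam=(0.8, 0.1, 0.1), window=8):
    """Interpolate unigram, ordered-bigram, and unordered-window counts."""
    unigram = sum(doc.count(q) for q in query)
    bigrams = list(zip(query, query[1:]))
    ordered = sum(ordered_pairs(doc, a, b) for a, b in bigrams)
    unordered = sum(unordered_window(doc, a, b, window) for a, b in bigrams)
    lt, lo, lu = lam
    return lt * unigram + lo * ordered + lu * unordered

doc = "sparse retrieval adapts sparse retrieval models to long documents".split()
print(sdm_score(["sparse", "retrieval"], doc))  # 0.8*4 + 0.1*2 + 0.1*4 = 3.8
```

In this framing, ExactSDM would restrict the proximity potentials to exact query terms, while SoftSDM would additionally score proximity between expansion terms produced by the model's masked language modeling head.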
Document type Conference contribution
Language English
Published at https://doi.org/10.1145/3539618.3591943