Reproducing NevIR Negation in Neural Information Retrieval

Open Access
Authors
  • Coen van den Elsen
  • Francien Barkhof
  • Thijmen Nijdam
  • Simon Lupart
Publication date 2025
Book title SIGIR '25
Book subtitle Proceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval: July 13-18, 2025, Padua, Italy
ISBN (electronic)
  • 9798400715921
Event 48th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2025
Pages (from-to) 3346-3356
Number of pages 11
Publisher New York, NY: Association for Computing Machinery
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract

Negation is a fundamental aspect of human communication, yet it remains a challenge for Language Models (LMs) in Information Retrieval (IR). Despite the heavy reliance of modern neural IR systems on LMs, little attention has been given to their handling of negation. In this study, we reproduce and extend the findings of NevIR, a benchmark study that revealed most IR models perform at or below the level of random ranking when dealing with negation. We replicate NevIR’s original experiments and evaluate newly developed state-of-the-art IR models. Our findings show that a recently emerging category, listwise Large Language Model (LLM) re-rankers, outperforms other models but still falls short of human performance. Additionally, we leverage ExcluIR, a benchmark dataset designed for exclusionary queries with extensive negation, to assess the generalisability of negation understanding. Our findings suggest that fine-tuning on one dataset does not reliably improve performance on the other, indicating notable differences in their data distributions. Furthermore, we observe that only cross-encoders and listwise LLM re-rankers achieve reasonable performance across both negation tasks.
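The evaluation setup described in the abstract can be sketched in a few lines. This is an illustrative reconstruction, not the benchmark's actual code: NevIR pairs two documents that differ by a negation with two queries, one matching each document, and a model scores an instance only if it ranks both queries correctly. The toy bag-of-words scorer and the example instance below are hypothetical, chosen to show why surface-level matching ties on negated intent and fails the strict pairwise criterion.

```python
def bag_of_words_score(query: str, doc: str) -> int:
    """Toy lexical scorer: count query terms that appear in the document."""
    doc_terms = set(doc.lower().split())
    return sum(term in doc_terms for term in query.lower().split())

def pairwise_accuracy(instances, score) -> float:
    """Fraction of instances where BOTH queries rank their own document first.

    Each instance is (q1, q2, d1, d2), with q1 matching d1 and q2 matching d2.
    """
    correct = 0
    for q1, q2, d1, d2 in instances:
        if score(q1, d1) > score(q1, d2) and score(q2, d2) > score(q2, d1):
            correct += 1
    return correct / len(instances)

# Hypothetical instance: the documents differ by a negation, and q2 states
# the negated intent as a paraphrase. The lexical scorer ties on q2, so the
# strict inequality fails and the instance counts as incorrect.
instances = [(
    "a medicine that treats headaches",          # q1, matches d1
    "a medicine ineffective against headaches",  # q2, matches d2
    "aspirin treats headaches",                  # d1
    "aspirin does not treat headaches",          # d2
)]
print(pairwise_accuracy(instances, bag_of_words_score))  # 0.0
```

Scoring an instance only when both rankings are right is what makes the metric strict: a model that ignores negation entirely ranks one query of each pair wrong and lands at or below random, which is the failure mode the paper reports for most IR models.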

Document type Conference contribution
Language English
Published at https://doi.org/10.1145/3726302.3730294
Other links https://www.scopus.com/pages/publications/105011820612