Beyond Reproducibility: Advancing Zero-shot LLM Reranking Efficiency with Setwise Insertion

Open Access
Authors
Publication date 2025
Book title SIGIR '25
Book subtitle Proceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval: July 13-18, 2025, Padua, Italy
ISBN (electronic)
  • 9798400715921
Event 48th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2025
Pages (from-to) 3205-3213
Number of pages 9
Publisher New York, NY: Association for Computing Machinery
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract

This study presents a comprehensive reproducibility analysis and extension of the Setwise prompting method for zero-shot ranking with Large Language Models (LLMs), as proposed by Zhuang et al. We evaluate the method's effectiveness and efficiency compared to traditional Pointwise, Pairwise, and Listwise approaches in document ranking tasks. Our reproduction confirms the findings of Zhuang et al., highlighting the trade-offs between computational efficiency and ranking effectiveness in Setwise methods. Building on these insights, we introduce Setwise Insertion, a novel approach that leverages the initial document ranking as prior knowledge, reducing unnecessary comparisons and uncertainty by prioritizing candidates more likely to improve the ranking results. Experimental results across multiple LLM architectures (Flan-T5, Vicuna, and Llama) show that Setwise Insertion yields a 31% reduction in query time, a 23% reduction in model inferences, and a slight improvement in reranking effectiveness compared to the original Setwise method. These findings highlight the practical advantage of incorporating prior ranking knowledge into Setwise prompting for efficient and accurate zero-shot document reranking.
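The idea sketched in the abstract, inserting each candidate into a growing ranked list via set-level comparisons, while using the first-stage ordering to skip candidates unlikely to enter the top-k, can be illustrated as follows. This is a hedged reconstruction, not the authors' exact algorithm: the `beats` callable is a hypothetical stand-in for a Setwise LLM prompt (it reports how many of the supplied pivots the candidate is more relevant than), and the pivot-spacing and skip heuristic are assumptions made for illustration.

```python
from typing import Callable, List, Sequence

# Hypothetical Setwise judgment: beats(cand, pivots) -> number of pivots
# (given best-first) that `cand` is more relevant than. In practice this
# would be a single LLM call over the candidate and the pivot set.
Judge = Callable[[str, Sequence[str]], int]

def setwise_insert(ranked: List[str], cand: str, beats: Judge,
                   set_size: int = 3) -> None:
    """Insert `cand` into `ranked` (best-first) by narrowing the insertion
    window with setwise probes: one call over up to `set_size` pivots
    splits the window into `set_size + 1` segments instead of 2."""
    lo, hi = 0, len(ranked)
    while lo < hi:
        n = hi - lo
        k = min(set_size, n)
        # Evenly spaced, distinct pivot positions inside [lo, hi).
        positions = [lo + (i + 1) * n // (k + 1) for i in range(k)]
        b = beats(cand, [ranked[p] for p in positions])
        # A consistent judge beats exactly the worst `b` pivots.
        if b < k:
            lo = positions[k - b - 1] + 1   # lost to this pivot: go below it
        if b > 0:
            hi = positions[k - b]           # beat this pivot: go above it
    ranked.insert(lo, cand)

def setwise_insertion_rerank(docs: List[str], top_k: int, beats: Judge,
                             set_size: int = 3) -> List[str]:
    """Rerank `docs` (assumed ordered by a first-stage retriever).
    The prior ordering lets us skip candidates that do not even beat the
    current worst top-k document, saving comparisons."""
    ranked: List[str] = []
    for cand in docs:
        if len(ranked) >= top_k and beats(cand, [ranked[-1]]) == 0:
            continue  # cheap probe: candidate cannot enter the top-k
        setwise_insert(ranked, cand, beats, set_size)
        del ranked[top_k:]  # keep only the top-k
    return ranked
```

A toy run with a score-based judge standing in for the LLM: with relevance scores `{"d1": 0.9, "d2": 0.1, "d3": 0.7, "d4": 0.4, "d5": 0.8}` and input order `["d5", "d1", "d3", "d2", "d4"]`, reranking with `top_k=3` returns `["d1", "d5", "d3"]`, and the low-scoring `d2` and `d4` are rejected with a single one-pivot probe each.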

Document type Conference contribution
Language English
Published at https://doi.org/10.1145/3726302.3730323
Other links https://www.scopus.com/pages/publications/105011824921