Accelerated Convergence for Counterfactual Learning to Rank

Open Access
Publication date 2020
Book title SIGIR '20
Book subtitle Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval: July 25-30, 2020, virtual event, China
ISBN (electronic)
  • 9781450380164
Event 43rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2020
Pages (from-to) 469–478
Publisher New York, NY: Association for Computing Machinery
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Counterfactual Learning to Rank (LTR) algorithms learn a ranking model from logged user interactions, often collected using a production system. Such an offline learning approach has many benefits compared to an online one, but it is challenging because user feedback often contains high levels of bias. Unbiased LTR uses Inverse Propensity Scoring (IPS) to enable unbiased learning from logged user interactions. A major difficulty in applying Stochastic Gradient Descent (SGD) to counterfactual learning problems is the large variance introduced by the propensity weights. In this paper we show that the convergence rate of SGD approaches with IPS-weighted gradients suffers from this variance: convergence is slow, especially when some IPS weights are large.
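The variance problem described above can be illustrated with a small simulation. This is a minimal sketch, not taken from the paper: a single item's relevance signal is estimated by dividing clicks by the examination propensity, which is unbiased but whose variance grows roughly as (1 - p)/p as the propensity p shrinks.

```python
import numpy as np

rng = np.random.default_rng(0)

def ips_estimates(propensity, relevance=1.0, n=100_000):
    """IPS estimates of a relevance signal under position bias.

    A user examines the item with probability `propensity` and, if
    examined, clicks with probability `relevance`. Dividing the click
    indicator by the propensity gives an unbiased but high-variance
    estimate of the relevance signal.
    """
    observed = rng.random(n) < propensity            # item is examined
    clicks = observed & (rng.random(n) < relevance)  # click only if examined
    return clicks / propensity                       # IPS-weighted signal

for p in [0.5, 0.1, 0.01]:
    est = ips_estimates(p)
    # Mean stays near the true relevance (unbiased); variance explodes
    # as the propensity shrinks, roughly as (1 - p) / p.
    print(f"p={p:>5}: mean={est.mean():.3f}  var={est.var():.1f}")
```

Smaller propensities leave the estimator unbiased but blow up its variance, which is exactly the effect that slows down IPS-weighted SGD.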
To overcome this limitation, we propose a novel learning algorithm, called CounterSample, that has provably better convergence than standard IPS-weighted gradient descent methods. We prove that CounterSample converges faster and complement our theoretical findings with extensive experiments in a number of biased LTR scenarios, across optimizers, batch sizes, and degrees of position bias.
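The abstract does not spell out how CounterSample works, so the following is only a generic sketch of the underlying idea of sample-based variance reduction, under the assumption that instances are drawn proportionally to their IPS weights rather than uniformly: an unweighted gradient over such a sample, rescaled by the mean weight, estimates the same IPS-weighted gradient with lower variance. The data here is synthetic and the variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-instance (scalar) gradients g and IPS weights w;
# large weights correspond to rarely observed positions.
g = rng.normal(size=1000)
w = 1.0 / rng.uniform(0.01, 1.0, size=1000)

# Target quantity: the full IPS-weighted gradient.
full = np.mean(w * g)

def ips_sgd_step(batch=32):
    # Standard approach: uniform batch, IPS-weighted average.
    i = rng.integers(len(g), size=batch)
    return np.mean(w[i] * g[i])

def weighted_sample_step(batch=32):
    # Sampling-based approach: draw instances with probability
    # proportional to w, average their *unweighted* gradients,
    # and rescale by the mean weight to stay unbiased.
    p = w / w.sum()
    i = rng.choice(len(g), size=batch, p=p)
    return w.mean() * np.mean(g[i])

ips = np.array([ips_sgd_step() for _ in range(2000)])
smp = np.array([weighted_sample_step() for _ in range(2000)])
print(f"IPS-weighted:    mean={ips.mean():.3f}  var={ips.var():.3f}")
print(f"weight-sampled:  mean={smp.mean():.3f}  var={smp.var():.3f}")
```

Both estimators target the same full gradient, but when the weights vary widely the sampling-based estimator has markedly lower variance, which is the kind of effect that translates into faster SGD convergence.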
Document type Conference contribution
Note With supplemental material
Language English
Published at https://doi.org/10.1145/3397271.3401069
Other links http://github.com/rjagerman/sigir2020
Downloads
jagerman-2020-accelerated (Accepted author manuscript)