Multileave Gradient Descent for Fast Online Learning to Rank
| Authors | |
|---|---|
| Publication date | 2016 |
| Book title | WSDM'16 |
| Book subtitle | Proceedings of the Ninth ACM International Conference on Web Search and Data Mining: February 22-25, 2016, San Francisco, CA, USA |
| ISBN | |
| ISBN (electronic) | |
| Event | WSDM 2016: The 9th International Conference on Web Search and Data Mining |
| Pages (from-to) | 457-466 |
| Publisher | New York, NY: Association for Computing Machinery |
| Organisations | |
| Abstract | Modern search systems are based on dozens or even hundreds of ranking features. The dueling bandit gradient descent (DBGD) algorithm has been shown to effectively learn combinations of these features solely from user interactions. DBGD explores the search space by comparing a possibly improved ranker to the current production ranker. To this end, it uses interleaved comparison methods, which can infer with high sensitivity a preference between two rankings based only on interaction data. A limiting factor is that it can compare only to a single exploratory ranker. We propose an online learning to rank algorithm called multileave gradient descent (MGD) that extends DBGD to learn from so-called multileaved comparison methods that can compare a set of rankings instead of merely a pair. We show experimentally that MGD allows for better selection of candidates than DBGD without the need for more comparisons involving users. An important implication of our results is that orders of magnitude less user interaction data is required to find good rankers when multileaved comparisons are used within online learning to rank. Hence, fewer users need to be exposed to possibly inferior rankers and our method allows search engines to adapt more quickly to changes in user preferences. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1145/2835776.2835804 |
| Downloads | wsdm2016-multileave-gradient-descent (Accepted author manuscript) |
| Permalink to this page | |
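The abstract describes MGD's core idea at a high level: generate several candidate rankers around the current one, use a multileaved comparison to infer which candidates users prefer, and step toward the winners. The following is a rough Python sketch of that update loop, not the authors' implementation: the multileaved comparison (which in reality infers winners from user clicks) is replaced by a hypothetical oracle that prefers candidates closer to a hidden target ranker, and the step sizes `delta`, `alpha`, and the candidate count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_unit_vector(dim):
    """Uniform random direction on the unit sphere."""
    u = rng.normal(size=dim)
    return u / np.linalg.norm(u)

def mgd_step(w, compare, n_candidates=9, delta=0.2, alpha=0.05):
    """One MGD update (mean-winner style sketch): generate candidate
    rankers around w, let a multileaved comparison identify which
    candidates beat the current ranker, then take a small step
    toward the mean of the winning exploration directions."""
    directions = [sample_unit_vector(w.shape[0]) for _ in range(n_candidates)]
    candidates = [w + delta * u for u in directions]
    winners = compare(w, candidates)  # indices of candidates preferred over w
    if not winners:
        return w  # current ranker won the comparison; keep it unchanged
    mean_direction = np.mean([directions[i] for i in winners], axis=0)
    return w + alpha * mean_direction

# Hypothetical stand-in for the multileaved comparison: in the real
# algorithm, winners are inferred from clicks on a multileaved result
# list; here an oracle prefers candidates closer to a hidden target.
target = np.array([0.8, -0.3, 0.5])

def oracle_compare(w, candidates):
    base = np.linalg.norm(w - target)
    return [i for i, c in enumerate(candidates)
            if np.linalg.norm(c - target) < base]

w = np.zeros(3)
for _ in range(500):
    w = mgd_step(w, oracle_compare)
# w has moved most of the way from the origin toward the target ranker
```

The multi-candidate comparison is what distinguishes this from DBGD: with a single exploratory ranker, `winners` would contain at most one index per user interaction, so far fewer comparisons contribute a useful gradient direction.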
