Overview of the Living Labs for Information Retrieval Evaluation (LL4IR) CLEF Lab 2015

Open Access
Authors
  • A. Schuth
  • K. Balog
  • L. Kelly
Publication date 2015
Host editors
  • J. Mothe
  • J. Savoy
  • J. Kamps
  • K. Pinel-Sauvagnat
  • G.J.F. Jones
  • E. SanJuan
  • L. Cappellato
  • N. Ferro
Book title Experimental IR Meets Multilinguality, Multimodality, and Interaction
Book subtitle 6th International Conference of the CLEF Association, CLEF 2015, Toulouse, France, September 8–11, 2015, Proceedings
ISBN
  • 9783319240268
ISBN (electronic)
  • 9783319240275
Series Lecture Notes in Computer Science
Event Conference and Labs of the Evaluation Forum
Pages (from-to) 484–496
Publisher Cham: Springer
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
In this paper we report on the first Living Labs for Information Retrieval Evaluation (LL4IR) CLEF Lab. Our main goal with the lab is to provide a benchmarking platform on which researchers can evaluate their ranking systems in a live setting, with real users in their natural task environments. For this first edition of the challenge we focused on two specific use cases: product search and web search. Ranking systems submitted by participants were experimentally compared against the production system of the corresponding use case using interleaved comparisons. We describe how these experiments were performed, what the outcomes were, and conclude with some lessons learned.
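To illustrate the interleaved comparisons mentioned in the abstract, the sketch below shows team-draft interleaving, a standard method for this kind of online evaluation: the submitted ranking and the production ranking take turns contributing documents to the result list shown to the user, and clicks are credited to whichever system contributed the clicked document. This is a minimal, hypothetical Python sketch; the function names and the click-crediting rule are illustrative assumptions, not the lab's actual code.

```python
import random

def team_draft_interleave(ranking_a, ranking_b, k=10):
    """Team-draft interleaving of two ranked lists of document ids.

    Proceeds in rounds: a coin flip decides which system picks first,
    then each system adds its highest-ranked document not yet in the
    interleaved list. Returns the interleaved list plus per-position
    team assignments, which are later used to credit clicks.
    """
    interleaved, teams, seen = [], [], set()
    while len(interleaved) < k and (
        any(d not in seen for d in ranking_a)
        or any(d not in seen for d in ranking_b)
    ):
        order = ["A", "B"] if random.random() < 0.5 else ["B", "A"]
        for team in order:
            ranking = ranking_a if team == "A" else ranking_b
            doc = next((d for d in ranking if d not in seen), None)
            if doc is not None and len(interleaved) < k:
                interleaved.append(doc)
                teams.append(team)
                seen.add(doc)
    return interleaved, teams

def credit_clicks(teams, clicked_ranks):
    """Credit each click to the system that contributed that position;
    the system with more credited clicks wins the impression."""
    wins = {"A": 0, "B": 0}
    for rank in clicked_ranks:
        wins[teams[rank]] += 1
    return wins

# Example impression: a participant ranking (A) vs. the production ranking (B).
a = ["d1", "d2", "d3", "d4"]
b = ["d3", "d1", "d5", "d6"]
shown, teams = team_draft_interleave(a, b, k=6)
print(shown, teams)
print(credit_clicks(teams, clicked_ranks=[0, 2]))  # user clicked positions 0 and 2
```

Aggregated over many impressions, the per-impression wins yield an outcome measure indicating whether the participant's system or the production system attracted more clicks.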
Document type Conference contribution
Language English
Related publication Extended Overview of the Living Labs for Information Retrieval Evaluation (LL4IR) CLEF Lab 2015
Published at https://doi.org/10.1007/978-3-319-24027-5_47
Downloads
clef2015-ll4ir (Accepted author manuscript)