"User Reviews in the Search Index? That'll Never Work!"

Authors
Publication date 2014
Editors
  • M. de Rijke
  • T. Kenter
  • A.P. de Vries
  • C.X. Zhai
  • F. de Jong
  • K. Radinsky
  • K. Hofmann
Book title Advances in Information Retrieval
Book subtitle 36th European Conference on IR Research, ECIR 2014, Amsterdam, The Netherlands, April 13-16, 2014. Proceedings
ISBN
  • 9783319060279
ISBN (electronic)
  • 9783319060286
Series Lecture Notes in Computer Science
Event 36th European Conference on Information Retrieval (ECIR'14)
Pages (from-to) 323-334
Publisher Cham: Springer
Organisations
  • Faculty of Humanities (FGw)
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
Online book search services allow users to tag and review books but do not include such data in the search index, which only contains titles, author names and professional subject descriptors. Such professional metadata is a limited description of the book, whereas tags and reviews can describe the content in more detail and cover many other aspects such as quality, writing style and engagement. In this paper we investigate the impact of including such user-generated content in the search index of a large collection of book records from Amazon and LibraryThing. We find that professional metadata is often too limited to provide good recall and precision and that both user reviews and tags can substantially improve performance. We perform a detailed analysis of different types of metadata and their impact on a number of topic categories and find that user-generated content is effective for a range of information needs. These findings are of direct relevance to large online book sellers and social cataloguing sites.
Document type Conference contribution
Language English
DOI https://doi.org/10.1007/978-3-319-06028-6_27