Studying Topical Relevance with Evidence-based Crowdsourcing

Open Access
Authors
  • Z. Szlávik
  • E. Simperl
  • E. Kanoulas
  • L. Aroyo
Publication date 2018
Book title CIKM '18
Book subtitle Proceedings of the 2018 ACM International Conference on Information and Knowledge Management: October 22-26, 2018, Torino, Italy
ISBN (electronic)
  • 9781450360142
Event 27th ACM International Conference on Information and Knowledge Management
Pages (from-to) 1253-1262
Number of pages 10
Publisher New York, NY: The Association for Computing Machinery
Organisations
  • Faculty of Economics and Business (FEB) - Amsterdam Business School Research Institute (ABS-RI)
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Information Retrieval systems rely on large test collections to measure their effectiveness in retrieving relevant documents. While the demand is high, the task of creating such test collections is laborious due to the large amounts of data that need to be annotated, and due to the intrinsic subjectivity of the task itself. In this paper we study topical relevance from a user perspective by addressing the problems of subjectivity and ambiguity. We compare our approach and results with the established TREC annotation guidelines and results. The comparison is based on a series of crowdsourcing pilots experimenting with variables such as relevance scale, document granularity, annotation template and the number of workers. Our results show a correlation between relevance assessment accuracy and document granularity: aggregating relevance judgments at the paragraph level yields higher accuracy than assessment done at the level of the full document. As expected, our results also show that collecting binary relevance judgments yields higher accuracy than the ternary scale used in the TREC annotation guidelines. Finally, the crowdsourced annotation tasks provided a more accurate document relevance ranking than a single assessor's relevance label. This work resulted in a reliable test collection around the TREC Common Core track.
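As a rough illustration of the paragraph-level aggregation the abstract describes, the sketch below shows one plausible scheme: binary worker votes are averaged per paragraph, and a document is scored by its most relevant paragraph, so the resulting scores can rank documents. All data, names, and the aggregation rule here are hypothetical placeholders, not the paper's actual method.

from collections import defaultdict

# Hypothetical per-paragraph binary judgments (1 = relevant, 0 = not relevant)
# collected from several crowd workers; keys and values are illustrative only.
judgments = {
    ("doc1", "para1"): [1, 1, 0, 1, 1],
    ("doc1", "para2"): [0, 0, 1, 0, 0],
    ("doc2", "para1"): [0, 1, 0, 0, 1],
}

def paragraph_score(votes):
    # Fraction of workers who judged the paragraph relevant.
    return sum(votes) / len(votes)

def document_scores(judgments):
    # Roll paragraph scores up to the document level: a document is
    # scored by its most relevant paragraph.
    scores = defaultdict(float)
    for (doc, _para), votes in judgments.items():
        scores[doc] = max(scores[doc], paragraph_score(votes))
    return dict(scores)

print(document_scores(judgments))  # e.g. {'doc1': 0.8, 'doc2': 0.4}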
Document type Conference contribution
Language English
Published at https://doi.org/10.1145/3269206.3271779