Analysing the Effect of Clarifying Questions on Document Ranking in Conversational Search
| Authors | |
|---|---|
| Publication date | 2020 |
| Book title | ICTIR'20 |
| Book subtitle | Proceedings of the 2020 ACM SIGIR International Conference on Theory of Information Retrieval: September 14-17, 2020, Virtual Event, Norway |
| ISBN (electronic) | |
| Event | 6th ACM SIGIR / 10th International Conference on the Theory of Information Retrieval, ICTIR 2020 |
| Pages (from-to) | 129-132 |
| Number of pages | 4 |
| Publisher | New York, NY: The Association for Computing Machinery |
| Organisations | |
| Abstract | Recent research on conversational search highlights the importance of mixed initiative in conversations. To enable mixed initiative, the system should be able to ask the user clarifying questions. However, the ability of the underlying ranking models (which support conversational search) to account for these clarifying questions and answers when ranking documents has, by and large, not been analysed. To this end, we analyse the performance of a lexical ranking model on a conversational search dataset with clarifying questions. We investigate, both quantitatively and qualitatively, how different aspects of clarifying questions and user answers affect ranking quality. We argue that the entire conversational round of clarification needs fine-grained treatment, based on the explicit feedback present in such mixed-initiative settings. Informed by our findings, we introduce a simple heuristic-based lexical baseline that significantly outperforms the existing naive baselines. Our work aims to enhance our understanding of the challenges present in this task and to inform the design of more appropriate conversational ranking models. |
| Document type | Conference contribution |
| Note | With supplemental material. |
| Language | English |
| Published at | https://doi.org/10.1145/3409256.3409817 |
| Other links | https://www.scopus.com/pages/publications/85093077305 |
| Downloads | 3409256.3409817 (Final published version) |
| Supplementary materials | |
