How Different are Pre-trained Transformers for Text Ranking?
| Authors | |
|---|---|
| Publication date | 2022 |
| Host editors | |
| Book title | Advances in Information Retrieval |
| Book subtitle | 44th European Conference on IR Research, ECIR 2022, Stavanger, Norway, April 10–14, 2022 : proceedings |
| ISBN | |
| ISBN (electronic) | |
| Series | Lecture Notes in Computer Science |
| Event | 44th European Conference on Information Retrieval, ECIR 2022 |
| Volume | |
| Issue number | II |
| Pages (from-to) | 207-214 |
| Number of pages | 8 |
| Publisher | Cham: Springer |
| Organisations | |
| Abstract | In recent years, large pre-trained transformers have led to substantial gains in performance over traditional retrieval models and feedback approaches. However, these results are primarily based on the MS MARCO/TREC Deep Learning Track setup, with its very particular characteristics, and our understanding of why and how these models work better is fragmented at best. We analyze effective BERT-based cross-encoders versus traditional BM25 ranking for the passage retrieval task, where the largest gains have been observed, and investigate two main questions. On the one hand, what is similar? To what extent does the neural ranker already encompass the capacity of traditional rankers? Is the gain in performance due to a better ranking of the same documents (prioritizing precision)? On the other hand, what is different? Can it effectively retrieve documents missed by traditional systems (prioritizing recall)? We discover substantial differences in the notion of relevance, identifying strengths and weaknesses of BERT that may inspire research toward future improvement. Our results contribute to our understanding of (black-box) neural rankers relative to (well-understood) traditional rankers, and help explain the particular experimental setting of MS-MARCO-based test collections. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1007/978-3-030-99739-7_24 |
| Other links | https://www.scopus.com/pages/publications/85128762425 |
| Downloads | 978-3-030-99739-7_24 (Final published version) |
