Overview of the CLEF 2024 SimpleText Task 3: Simplify Scientific Text
| Authors | |
|---|---|
| Publication date | 2024 |
| Host editors | |
| Book title | Working Notes of the Conference and Labs of the Evaluation Forum (CLEF 2024) |
| Book subtitle | Grenoble, France, 9-12 September, 2024 |
| Series | CEUR Workshop Proceedings |
| Event | 2024 Conference and Labs of the Evaluation Forum |
| Article number | 307 |
| Pages (from-to) | 3147-3162 |
| Number of pages | 16 |
| Publisher | Aachen: CEUR-WS |
| Organisations | |
| Abstract | This article provides a comprehensive summary of the CLEF 2024 SimpleText Task 3, which focuses on simplifying scientific text based on specific queries. We discuss in detail the motivation for lay access to scholarly literature, and provide an overview of the setup of the scientific text simplification task. One of the main innovations of the CLEF 2024 SimpleText Task 3 is to complement sentence-level text simplification with a document-level text simplification task. We describe the resulting sentence-level and document-level text simplification test collection in detail, which consists of a corpus of over 1,500 paired source and reference sentences, and a corpus of over 250 paired source and reference abstracts, both containing the source text from scientific abstracts with direct reference simplifications produced by human annotators. We present the results of the participants' submissions, with 15 teams submitting 52 sentence-level text simplification runs and 9 teams submitting 31 document-level text simplification runs. The article concludes with an in-depth analysis, including information distortion and potential LLM "hallucinations" in the simplified sentences submitted by participants. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://ceur-ws.org/Vol-3740/paper-307.pdf |
| Other links | https://ceur-ws.org/Vol-3740/ |
| Downloads | paper-307 (Final published version) |
| Permalink to this page | |
