Do Instruction-tuned Large Language Models Help with Relation Extraction?
| Authors | |
|---|---|
| Publication date | 2023 |
| Host editors | |
| Book title | Joint proceedings of the 1st workshop on Knowledge Base Construction from Pre-Trained Language Models (KBC-LM) and the 2nd challenge on Language Models for Knowledge Base Construction (LM-KBC) |
| Book subtitle | co-located with the 22nd International Semantic Web Conference (ISWC 2023) : Athens, Greece, November 6, 2023 |
| Series | CEUR Workshop Proceedings |
| Event | 1st Workshop on Knowledge Base Construction from Pre-Trained Language Models and the 2nd Challenge on Language Models for Knowledge Base Construction, KBC-LM + LM-KBC 2023 |
| Article number | 15 |
| Number of pages | 7 |
| Publisher | Aachen: CEUR-WS |
| Organisations | |
| Abstract | Information extraction, and specifically relation extraction, are key tasks in knowledge base construction. With in-context learning, Large Language Models (LLMs) often demonstrate impressive generalization on unseen information extraction tasks, even with limited examples. However, when using in-context learning for relation extraction, LLMs are not competitive with fully supervised baselines that employ smaller language models. To address this, we explore the potential of instruction-tuning as a mechanism to improve relation extraction performance while preserving in-context capabilities. Our preliminary results demonstrate that instruction-tuned LLMs have the potential to achieve performance comparable to fully supervised smaller LMs. We instruction-tuned a Dolly-v2-3B model using the parameter-efficient approach LoRA on a challenging silver-standard relation extraction dataset comprising 1,079 relations. The instruction-tuned model achieves a 28.5 micro-F1 and a 27.3 macro-F1 score under a strict matching evaluation strategy. Additionally, manual evaluation by two annotators shows an average accuracy of 66.5% with an inter-annotator agreement of 0.760. Code and dataset are available at https://github.com/INDElab/KGC-LLM.git. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://ceur-ws.org/Vol-3577/paper15.pdf |
| Other links | https://ceur-ws.org/Vol-3577/ |
| Downloads | paper15 (Final published version) |
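
The abstract describes parameter-efficient instruction-tuning of Dolly-v2-3B with LoRA. As a point of reference, the following is a minimal sketch of what such a setup might look like using Hugging Face's `transformers` and `peft` libraries; the LoRA hyperparameters and the prompt template are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of LoRA instruction-tuning for relation extraction,
# assuming Dolly-v2-3B via Hugging Face `transformers` and `peft`.
# Hyperparameters and the prompt template are illustrative assumptions,
# not the paper's exact configuration.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "databricks/dolly-v2-3b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Attach LoRA adapters to the fused attention projection used by
# Dolly-v2's GPT-NeoX backbone; r and alpha are assumed values.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["query_key_value"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only adapter weights are trainable

# One instruction-style relation-extraction example (hypothetical template).
prompt = (
    "### Instruction:\nExtract the relation between the marked entities.\n\n"
    "### Input:\n[E1] Marie Curie [/E1] was born in [E2] Warsaw [/E2].\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt")
```

Because only the adapter weights are trainable, a setup along these lines keeps the memory footprint of fine-tuning a 3B-parameter model modest while leaving the base model's in-context abilities intact, which is the trade-off the abstract highlights.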
