Improving Graph-to-Text Generation Using Cycle Training
| Authors | |
|---|---|
| Publication date | 08-2023 |
| Host editors | |
| Book title | Language, data and knowledge 2023 |
| Book subtitle | LDK 2023 : proceedings of the 4th Conference on Language, Data and Knowledge : 12-15 September 2023, Vienna, Austria |
| ISBN (electronic) | |
| Event | 4th Conference on Language, Data and Knowledge |
| Pages (from-to) | 256-261 |
| Publisher | Lisboa: NOVA CLUNL |
| Organisations | |
| Abstract | Natural Language Generation (NLG) from graph-structured data is an important step for a number of tasks, including, e.g., generating explanations, automated reporting, and conversational interfaces. Large generative language models are currently the state of the art for open-ended NLG from graph data. However, these models can produce erroneous text (termed hallucinations). In this paper, we investigate the application of *cycle training* to reduce these errors. Cycle training alternates generating text from an input graph with extracting a knowledge graph from the generated text, where the model should ensure consistency between the extracted graph and the input graph. Our results show that cycle training improves performance on evaluation metrics (e.g., METEOR, DAE) that consider syntactic and semantic relations and, more generally, that cycle training is useful for reducing erroneous output when generating text from graphs. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.34619/srmk-injj |
| Published at | https://aclanthology.org/2023.ldk-1.24 |
| Downloads | 2023.ldk-1.24 (Final published version) |
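The cycle-training loop described in the abstract can be sketched as follows. This is a minimal, framework-agnostic illustration, not the authors' implementation: all names (`Graph2Text`, `Text2Graph`, `cycle_step`, `graph_consistency`) are hypothetical, the string-based stand-in models exist only so the example runs, and a real system would use trained neural models with gradient updates where the `update` stubs appear.

```python
# Hypothetical sketch of one graph -> text -> graph half-cycle of cycle
# training. None of these classes or functions come from the paper; they
# illustrate the idea of enforcing consistency between the input graph and
# the graph re-extracted from the generated text.

from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object)

class Graph2Text:
    """Stand-in for a generative LM that verbalizes a knowledge graph."""
    def generate(self, graph: List[Triple]) -> str:
        return " ".join(f"{s} {r} {o}." for s, r, o in graph)

    def update(self, graph: List[Triple], target_text: str) -> None:
        pass  # in practice: one gradient step on (graph -> target_text)

class Text2Graph:
    """Stand-in for an information-extraction model (text -> triples)."""
    def extract(self, text: str) -> List[Triple]:
        triples = []
        for sent in filter(None, (s.strip() for s in text.split("."))):
            parts = sent.split()
            if len(parts) >= 3:
                triples.append((parts[0], parts[1], " ".join(parts[2:])))
        return triples

    def update(self, text: str, target_graph: List[Triple]) -> None:
        pass  # in practice: one gradient step on (text -> target_graph)

def graph_consistency(pred: List[Triple], gold: List[Triple]) -> float:
    """Fraction of input triples recovered from the generated text
    (simple recall; F1-style scores are also common)."""
    if not gold:
        return 1.0
    return len(set(pred) & set(gold)) / len(gold)

def cycle_step(g2t: Graph2Text, t2g: Text2Graph, graph: List[Triple]) -> float:
    """One half-cycle: generate text, re-extract a graph, and use the
    input graph as the supervision signal for both directions."""
    text = g2t.generate(graph)      # graph -> text
    extracted = t2g.extract(text)   # text  -> graph
    t2g.update(text, graph)         # input graph supervises extraction
    g2t.update(graph, text)         # reconstruction signal for generation
    return graph_consistency(extracted, graph)

if __name__ == "__main__":
    graph = [("Vienna", "capitalOf", "Austria"),
             ("LDK_2023", "heldIn", "Vienna")]
    score = cycle_step(Graph2Text(), Text2Graph(), graph)
    print(f"cycle consistency: {score:.2f}")  # 1.00 for this toy example
```

A complete cycle-training setup would typically alternate this half-cycle with the reverse text → graph → text direction, so that each model provides training signal for the other.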
