What Comes Next? Evaluating Uncertainty in Neural Text Generators Against Human Production Variability

Open Access
Authors
  • B. Plank
Publication date 2023
Host editors
  • H. Bouamor
  • J. Pino
  • K. Bali
Book title The 2023 Conference on Empirical Methods in Natural Language Processing
Book subtitle EMNLP 2023: Proceedings of the Conference, December 6–10, 2023
ISBN (electronic)
  • 9798891760608
Event 2023 Conference on Empirical Methods in Natural Language Processing
Pages (from-to) 14349–14371
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
In Natural Language Generation (NLG) tasks, for any input, multiple communicative goals are plausible, and any goal can be put into words, or produced, in multiple ways. We characterise the extent to which human production varies lexically, syntactically, and semantically across four NLG tasks, connecting human production variability to aleatoric (data) uncertainty. We then inspect the space of output strings shaped by a generation system’s predicted probability distribution and decoding algorithm to probe its uncertainty. For each test input, we measure the generator’s calibration to human production variability. Following this instance-level approach, we analyse NLG models and decoding strategies, demonstrating that probing a generator with multiple samples and, when possible, multiple references, provides the level of detail necessary to understand a model’s representation of uncertainty.
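The instance-level probing described in the abstract can be illustrated with a minimal sketch: draw several samples from a generator, then compare the spread of lexical distances among the samples to their distance from multiple human references. This is a hypothetical toy implementation, not the authors' code (their implementation is linked above); the distance function and helper names are illustrative assumptions, using a simple word-level similarity as a stand-in for the lexical, syntactic, and semantic distances studied in the paper.

```python
# Toy sketch of instance-level uncertainty probing (hypothetical, not the
# authors' implementation): compare lexical variability among generator
# samples to their distance from human references for one test input.
from difflib import SequenceMatcher
from itertools import combinations, product

def lexical_distance(a: str, b: str) -> float:
    """A simple lexical distance: 1 minus word-sequence similarity."""
    return 1.0 - SequenceMatcher(None, a.split(), b.split()).ratio()

def mean_pairwise_distance(texts: list[str]) -> float:
    """Average distance between all pairs of samples (production variability)."""
    pairs = list(combinations(texts, 2))
    return sum(lexical_distance(a, b) for a, b in pairs) / len(pairs)

def mean_cross_distance(samples: list[str], references: list[str]) -> float:
    """Average distance from each model sample to each human reference."""
    pairs = list(product(samples, references))
    return sum(lexical_distance(s, r) for s, r in pairs) / len(pairs)

# Toy data standing in for generator samples and human references.
samples = [
    "the cat sat on the mat",
    "a cat is sitting on a mat",
    "the cat sat on the mat",
]
references = [
    "the cat sat on the mat",
    "a cat rests on the rug",
]

print(f"sample variability: {mean_pairwise_distance(samples):.3f}")
print(f"sample-to-reference distance: {mean_cross_distance(samples, references):.3f}")
```

If the two quantities are close, the generator's output spread for this input roughly matches the spread of human productions; a large gap in either direction suggests under- or over-dispersed uncertainty for that instance.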
Document type Conference contribution
Note With supplementary video
Language English
Related dataset whatsnext-scores
Published at https://doi.org/10.18653/v1/2023.emnlp-main.887
Other links https://github.com/dmg-illc/nlg-uncertainty-probes