Hallucination refers to generating text that is not 'faithful' to the source. In most cases, hallucination stems from divergence between the source and the reference, so the model learns to produce content that the source does not actually support. In the context of image captioning, for example, hallucination can be defined as generating captions that describe objects not present in the given image.

Hallucination has also been observed when the model latches onto spurious correlations between different parts of the training data. The common theme across these explanations of hallucination in conditional NLG tasks is predictive uncertainty: in general, higher predictive uncertainty corresponds to a higher chance of hallucination, and epistemic uncertainty is more indicative of hallucination than aleatoric or total uncertainty.

In image captioning, for example, it has been shown that objects generated at higher uncertainty levels are more likely to be hallucinated, and this correlation is also consistent across data-to-text generation, abstractive summarization, and neural machine translation (NMT).
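
To make the uncertainty decomposition concrete, the sketch below (a hypothetical example, not code from the paper) estimates next-token predictive uncertainty with Monte Carlo dropout and splits the total entropy into aleatoric and epistemic parts. It assumes a PyTorch model with a HuggingFace-style forward pass that returns logits; the function name, sample count, and interface are illustrative placeholders.

    import torch

    def decompose_uncertainty(model, inputs, num_samples=10, eps=1e-12):
        # Keep dropout active so each forward pass samples a different set of weights.
        model.train()
        probs = []
        with torch.no_grad():
            for _ in range(num_samples):
                logits = model(**inputs).logits[:, -1, :]   # next-token logits (assumed HF-style output)
                probs.append(torch.softmax(logits, dim=-1))
        probs = torch.stack(probs)                           # (num_samples, batch, vocab)
        mean_probs = probs.mean(dim=0)
        # Total predictive uncertainty: entropy of the averaged distribution.
        total = -(mean_probs * (mean_probs + eps).log()).sum(dim=-1)
        # Aleatoric part: average entropy of the individual sampled distributions.
        aleatoric = -(probs * (probs + eps).log()).sum(dim=-1).mean(dim=0)
        # Epistemic part: the gap, i.e. the mutual information between prediction and weights.
        epistemic = total - aleatoric
        return total, aleatoric, epistemic

The epistemic term here (the mutual information between the prediction and the model weights) is a standard proxy for epistemic uncertainty, the component described above as most indicative of hallucination.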

Covers: theory of Hallucination in NLG
Estimated time needed to finish: 2 minutes
Questions this item addresses:
  • So, overall, how does hallucination in NLG occur?

Understanding the paper: On Hallucination and Predictive Uncertainty in Conditional Language Generation

Total time needed: ~2 hours
Objectives
Understanding the paper "On Hallucination and Predictive Uncertainty in Conditional Language Generation".
Potential Use Cases
Neural natural language generation: image captioning, data-to-text generation, abstractive summarization, and neural machine translation.
Who Is This For?
INTERMEDIATE: Natural Language Processing (NLP) developers looking to better understand and correct for hallucination in a variety of Natural Language Generation tasks.
Resource Assets (10)
ARTICLE 1. Hallucination in Neural NLG
  • What is hallucination in Neural NLG?
  • What are some examples of hallucination in Neural NLG?
  • Why is hallucination unacceptable in many NLG applications?
5 minutes
PAPER 2. Hallucination in Image Captioning - Object Hallucination in Image Captioning (Proceedings of EMNLP 2018, pages 4035–4045)
  • How do hallucinations occur during image captioning?
  • Why do hallucinations occur during image captioning?
12 minutes
PAPER 3. Challenges in Data-to-Document Generation
  • Why do hallucinations occur during data-to-text generation?
  • How do hallucinations occur during data-to-text generation?
4 minutes
PAPER 4. Hallucination in Abstractive Summarization - FEQA: A Question Answering Evaluation Framework for Faithfulness Assessment in Abstractive Summarization
  • Why do hallucinations occur during abstractive summarization?
  • How do hallucinations occur during abstractive summarization?
15 minutes
PAPER 5. Domain Robustness in Neural Machine Translation
  • Why do hallucinations occur during neural machine translation (NMT)?
  • How do hallucinations occur during neural machine translation (NMT)?
6 minutes
PAPER 6. Aleatoric and epistemic uncertainty in machine learning: an introduction to concepts and methods
  • What is aleatoric uncertainty?
  • What is epistemic uncertainty?
  • How are aleatoric uncertainty and epistemic uncertainty different?
10 minutes
WRITEUP 7. Therefore, How Hallucination Occurs
  • So, overall, how does hallucination in NLG occur?
2 minutes
WRITEUP 8. Uncertainty-Aware Beam Search (UABS) from the Paper "On Hallucination and Predictive Uncertainty in Conditional Language Generation"
  • What is Uncertainty-Aware Beam Search (UABS)?
  • How does UABS work? (A minimal sketch follows this item.)
  • How well does UABS work compared to regular beam search?
8 minutes
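
For a concrete picture of the method before reading the writeup: the sketch below illustrates the core idea of UABS, ranking beam candidates by log-probability minus a weighted uncertainty penalty. The function uabs_step, the per-beam uncertainty input, and the weight lam are illustrative assumptions, not the paper's implementation.

    import math

    def uabs_step(beams, next_token_probs, uncertainty, lam=0.5, beam_width=4):
        # beams: list of (token_ids, cumulative_score)
        # next_token_probs[b][v]: probability of token v continuing beam b
        # uncertainty[b]: predictive uncertainty of beam b at this decoding step
        candidates = []
        for b, (tokens, score) in enumerate(beams):
            for v, p in enumerate(next_token_probs[b]):
                if p <= 0.0:
                    continue
                # Usual beam score (running sum of log-probabilities) minus
                # the weighted uncertainty penalty that UABS introduces.
                new_score = score + math.log(p) - lam * uncertainty[b]
                candidates.append((tokens + [v], new_score))
        candidates.sort(key=lambda c: c[1], reverse=True)
        return candidates[:beam_width]

Increasing lam steers decoding toward lower-uncertainty continuations, trading some likelihood for a reduced chance of hallucinated content.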
PAPER 9. Further Research: Deep and Confident Prediction for Time Series at Uber
  • How does uncertainty apply to other domains?
6 minutes
PAPER 10. Quantifying Uncertainties in Natural Language Processing Tasks
  • How does uncertainty apply to other domains within NLP?
5 minutes
