Hallucination (NLP)
noun
Definition: In NLP and large language model (LLM) research, hallucination refers to generated content that is nonsensical or unfaithful to the provided source input or to real-world facts. A widely cited survey defines hallucination in natural language generation as generated text that is “nonsensical or unfaithful to the provided source content” [Ji et al. 2023].
Example in context: “As LLMs gain widespread application, they tend to produce hallucinations – grammatically coherent but factually inaccurate content – posing substantial challenges.” [Sibaee et al. 2024]
Related terms: factuality error, groundedness, faithfulness, confabulation, misinformation, hallucinated content
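To make the related notion of faithfulness concrete, here is a deliberately naive toy sketch: it flags generated sentences with little lexical overlap with the source as potentially unfaithful. This is an illustration only, not a real hallucination detector; the function name, threshold, and overlap heuristic are assumptions for this example, and practical systems use stronger signals (e.g. entailment or QA-based checks) rather than word overlap.

```python
def unsupported_sentences(source: str, generated: str, threshold: float = 0.5):
    """Toy faithfulness check: return generated sentences whose word
    overlap with the source falls below `threshold` (illustrative only)."""
    src_tokens = set(source.lower().split())
    flagged = []
    for sent in generated.split(". "):
        toks = [t.strip(".,").lower() for t in sent.split()]
        if not toks:
            continue
        # Fraction of the sentence's words that also appear in the source.
        overlap = sum(t in src_tokens for t in toks) / len(toks)
        if overlap < threshold:
            flagged.append(sent)
    return flagged
```

For example, given the source "The Eiffel Tower is in Paris", the generated sentence "It was built on the Moon" would be flagged, while a restatement of the source would not. Real hallucinations are often far harder to catch, since they can reuse much of the source's vocabulary while still asserting false content.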