
Hallucination

An incorrect or unexpected response from an AI system. Such responses include generative AI answers that are factually wrong but stated with the confidence of a correct answer. In applications where accuracy is critical, hallucinations are, of course, problematic. However, in applications where AI is employed to generate new content, hallucination-like output can be beneficial, offering the user a novel perspective.
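One knob behind this trade-off is sampling temperature. The minimal sketch below (not part of the glossary; the vocabulary and scores are hypothetical) shows how raising the temperature flattens a model's next-token distribution: low temperatures favor the most likely answer, while high temperatures make unlikely, hallucination-like continuations more probable.

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Sample one token index from temperature-scaled logits."""
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / max(temperature, 1e-8)
    probs = np.exp(scaled - scaled.max())  # numerically stable softmax
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Hypothetical next-token scores for the prompt "The capital of France is"
vocab = ["Paris", "Lyon", "London", "Narnia"]
logits = [5.0, 2.0, 1.0, 0.5]

rng = np.random.default_rng(0)
for t in (0.2, 1.0, 2.0):
    picks = [vocab[sample_token(logits, temperature=t, rng=rng)] for _ in range(10)]
    print(f"temperature={t}: {picks}")
```

At a temperature of 0.2 nearly every sample is "Paris"; at 2.0 the distribution is nearly flat, and confident-sounding but wrong picks such as "Narnia" appear. This is why accuracy-critical applications tend to sample conservatively, while creative applications may deliberately sample more loosely.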