What is Hallucination?
TL;DR
The phenomenon where an AI model generates plausible-sounding but factually incorrect information. One of the most critical risks when using AI output.
Hallucination: Definition & Explanation
Hallucination refers to the phenomenon where an AI model generates information that sounds convincing but is not grounded in fact. Because LLMs generate text based on statistical patterns in their training data rather than verified knowledge, they can produce nonexistent facts, fabricated citations, or incorrect figures. For example, an LLM might cite a research paper that does not exist or reference a law that was never enacted. Countermeasures include using RAG (Retrieval-Augmented Generation) to ground answers in retrieved documents, verifying sources, lowering the temperature parameter to reduce output randomness, and fact-checking outputs against trusted references. AI search tools such as Perplexity AI, which attach source citations to their answers, can also help mitigate hallucination risk by making claims easier to verify, though the citations themselves still need to be checked.
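The RAG countermeasure mentioned above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the keyword-overlap retrieval and all function names here are assumptions for the example, whereas real RAG systems typically retrieve passages with embedding-based vector search before injecting them into the prompt.

```python
import re


def tokenize(text: str) -> set[str]:
    """Lowercased word set with punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))


def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query (naive stand-in
    for real embedding-based retrieval)."""
    query_words = tokenize(query)
    ranked = sorted(
        documents,
        key=lambda doc: len(query_words & tokenize(doc)),
        reverse=True,
    )
    return ranked[:top_k]


def build_grounded_prompt(query: str, sources: list[str]) -> str:
    """Inject retrieved text so the model answers from the sources
    rather than from its statistical memory."""
    source_block = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(sources))
    return (
        "Answer using ONLY the numbered sources below. "
        "If the answer is not in the sources, say you do not know.\n\n"
        f"Sources:\n{source_block}\n\nQuestion: {query}"
    )


documents = [
    "The Eiffel Tower in Paris is 330 metres tall.",
    "The Great Wall of China is over 21,000 kilometres long.",
    "Mount Everest rises 8,849 metres above sea level.",
]
query = "How tall is the Eiffel Tower?"
prompt = build_grounded_prompt(query, retrieve(query, documents))
```

The explicit instruction to answer only from the sources, plus the fallback "say you do not know", is what discourages the model from filling gaps with fabricated facts; the same grounded prompt would then be sent to the LLM, typically with a low temperature setting.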