# The Enemy of Production AI
A hallucination occurs when an LLM generates plausible-sounding but false information. In production, hallucinations destroy user trust.
## Hallucination Categories
| Type | Example | Severity |
|---|---|---|
| Factual | "Your order shipped yesterday" (it didn't) | Critical |
| Entity | Inventing a product that doesn't exist | High |
| Temporal | Wrong dates, delivery estimates | High |
| Numeric | Wrong prices, quantities | Critical |
| Policy | Making up return policies | Critical |
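Several of these categories can be caught by validating model claims against ground-truth data before the response reaches the user. The sketch below is illustrative, not a production design: the catalog, the `Claim` structure, and the flag strings are all hypothetical, and it assumes claims have already been extracted from the model's output.

```python
from dataclasses import dataclass

# Hypothetical ground-truth product catalog (name -> price).
CATALOG = {"USB-C Cable": 9.99, "Wireless Mouse": 24.50}

@dataclass
class Claim:
    """A product/price assertion extracted from a model response."""
    product: str
    price: float

def check_claim(claim: Claim) -> list[str]:
    """Return hallucination flags for one claim; empty list means it checks out."""
    issues = []
    if claim.product not in CATALOG:
        # Entity hallucination: the product doesn't exist in our data.
        issues.append("entity: unknown product")
    elif claim.price != CATALOG[claim.product]:
        # Numeric hallucination: real product, wrong price.
        issues.append("numeric: wrong price")
    return issues

print(check_claim(Claim("Wireless Mouse", 19.99)))  # → ['numeric: wrong price']
print(check_claim(Claim("Laser Keyboard", 5.00)))   # → ['entity: unknown product']
```

Temporal and policy hallucinations need the same pattern with different ground truth: a shipment-tracking lookup for dates, and a fixed policy document (retrieved, not generated) for return rules.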