Confabulation
ELI5 — The Vibe Check
Confabulation is when an AI invents a plausible-sounding answer and delivers it with total confidence, when the honest answer would be 'I don't know.' It's not lying — the model genuinely can't tell it's making things up. It's like a very confident hallucination.
Real Talk
In the AI context, confabulation refers to the generation of plausible-sounding but fabricated information, delivered in a tone indistinguishable from an accurate response. It's rooted in how models are trained: they learn to produce fluent outputs regardless of how certain their underlying knowledge is. Mitigations include retrieval augmentation, abstention training, and calibrated uncertainty. The distinction from hallucination is mostly one of framing: confabulation emphasizes the confidence dimension.
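The abstention idea above can be sketched in a few lines. This is a toy illustration, not a real system: the per-answer confidence scores are hard-coded (a production setup would derive them from token log-probabilities or a separate calibration model), and the 0.75 threshold is an assumed value.

```python
def answer_or_abstain(scored_answers, threshold=0.75):
    """Return the top-scoring answer, or abstain when confidence is low.

    scored_answers: dict mapping candidate answer -> confidence in [0, 1].
    threshold: minimum confidence required to answer at all (assumed value).
    """
    best_answer, best_score = max(scored_answers.items(), key=lambda kv: kv[1])
    if best_score < threshold:
        # Abstention: the honest "I don't know" instead of a confident guess.
        return "I don't know."
    return best_answer

# A well-calibrated model answers when it is sure...
print(answer_or_abstain({"Paris": 0.95, "Lyon": 0.05}))
# ...and abstains when probability mass is spread across candidates.
print(answer_or_abstain({"1987": 0.4, "1989": 0.35, "1991": 0.25}))
```

The point of the sketch is the shape of the fix: confabulation happens when the model always takes the `return best_answer` branch, no matter how thin its evidence is.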
When You'll Hear This
"Model confabulated a citation for a paper that doesn't exist." / "Frontier models reduce confabulation but don't eliminate it."
Related Terms
Hallucination
When an AI confidently makes something up — like citing a library that doesn't exist or generating code that calls a function that was never written.
Hallucination Drift
Hallucination drift is when an AI starts with one small made-up fact, then builds on it, and by the end the entire conversation is based on fiction.
RAG (Retrieval Augmented Generation)
RAG is how you give an AI access to your private documents without retraining it: relevant passages are retrieved at question time and inserted into the prompt, so the model answers from them instead of from memory.
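A minimal sketch of the RAG flow described above, under two simplifying assumptions: a trivial word-overlap scorer stands in for real embedding-based vector search, and `build_prompt` stops at prompt construction rather than calling an actual model.

```python
def retrieve(query, documents, top_k=1):
    """Rank documents by word overlap with the query (a crude stand-in
    for embedding similarity) and return the top_k matches."""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query, documents):
    """Prepend retrieved passages so the model answers from provided
    context instead of confabulating from memory."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Vacation policy: employees accrue 1.5 days of leave per month.",
    "Expense policy: receipts are required for purchases over 50 dollars.",
]
print(build_prompt("How many vacation days do employees accrue?", docs))
```

Grounding the answer in retrieved text is exactly the anti-confabulation move: the model is steered toward quoting what's in front of it rather than inventing a plausible-sounding policy.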