Hallucination
ELI5 — The Vibe Check
When an AI confidently makes something up — like citing a library that doesn't exist or generating code that calls a function that was never written. It sounds right, it looks right, but it's pure fiction. The AI equivalent of that kid in school who presented a book report on a book they obviously never read, but did it with such confidence that nobody questioned it.
Real Talk
In the context of AI-assisted coding, hallucinations occur when models generate syntactically valid but factually incorrect code — referencing non-existent APIs, fabricating library methods, or producing plausible but broken logic. This is a key risk in vibe coding and requires human review to catch.
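One cheap sanity check for the "fabricated library method" case is to confirm that a module and attribute the AI referenced actually exist before trusting the generated call. A minimal sketch in Python, using only the standard library (`dumps_pretty` below is a hypothetical, hallucinated-sounding name used for illustration):

```python
import importlib

def api_exists(module_name: str, attr: str) -> bool:
    """Return True if `module_name` imports and exposes `attr`."""
    try:
        mod = importlib.import_module(module_name)
    except ModuleNotFoundError:
        return False
    return hasattr(mod, attr)

# A real function in the standard library:
print(api_exists("json", "dumps"))         # True
# A plausible-sounding fabrication (hypothetical name):
print(api_exists("json", "dumps_pretty"))  # False
```

A check like this only catches nonexistent names, not plausible-but-broken logic — that part still needs a human (or a test suite) to read the code.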
When You'll Hear This
"The AI hallucinated a function that doesn't exist in that library." / "Always verify AI-generated code — hallucinations look real until they break."
Related Terms
Code Review
A code review is when another developer reads your code before it gets merged, looking for bugs, bad practices, or anything confusing.
Grounding
Grounding is giving the AI real, verified information to base its answers on — so it doesn't just make stuff up.
Guardrails
Guardrails are the safety nets you put around AI applications — rules and checks that prevent the AI from going rogue.
RAG (Retrieval Augmented Generation)
RAG is how you give an AI access to your private documents without retraining it — relevant passages are retrieved and fed into the prompt at answer time.
Vibe Coding
Vibe coding is writing software by vibes — you describe what you want to an AI, it writes the code, and you just... vibe.