Grounding
ELI5 — The Vibe Check
Grounding is giving the AI real, verified information to base its answers on — so it doesn't just make stuff up. It's like the difference between writing an essay from memory (hallucination city) and writing one with the textbook open in front of you. You feed the model actual documents, databases, or search results, and it uses those as its source of truth.
Real Talk
Grounding connects LLM outputs to verifiable external data sources, reducing hallucinations and improving factual accuracy. Techniques include RAG (retrieving relevant documents), tool use (querying databases/APIs), knowledge graph integration, and citation generation. Grounded systems can attribute claims to specific sources, enabling verification.
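In code, the grounding step often comes down to numbering the retrieved sources and instructing the model to cite them, so every claim can be traced back. A minimal sketch, assuming the documents have already been retrieved; `build_grounded_prompt` is a hypothetical helper, and the actual LLM call is out of scope:

```python
def build_grounded_prompt(question: str, sources: list[str]) -> str:
    """Number each retrieved source and instruct the model to cite them."""
    numbered = "\n".join(f"[{i}] {text}" for i, text in enumerate(sources, start=1))
    return (
        "Answer using ONLY the sources below. "
        "Cite each claim with its source number, e.g. [1].\n\n"
        f"Sources:\n{numbered}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_grounded_prompt(
    "What is our refund window?",
    ["Refunds are accepted within 30 days of purchase.",
     "Gift cards are non-refundable."],
)
print(prompt)
```

Because the sources carry stable numbers, a citation like "[1]" in the model's answer maps directly to a verifiable document, which is what enables the attribution mentioned above.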
When You'll Hear This
"Ground the model's responses in our documentation to prevent hallucinations." / "The grounded search feature cites actual sources for every claim."
Related Terms
Embedding
An embedding turns words, sentences, or entire documents into lists of numbers (vectors) that capture their meaning.
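Real embedding models learn dense vectors, but a toy bag-of-words vector shows the core idea: "similar meaning" becomes "nearby vectors" under cosine similarity. This is an illustrative stand-in, not how production models work:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: a sparse vector of word counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

cat_a = embed("the cat sat on the mat")
cat_b = embed("the cat lay on the mat")
finance = embed("quarterly revenue grew sharply")
print(cosine(cat_a, cat_b) > cosine(cat_a, finance))  # → True: related sentences score higher
```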
Hallucination
When an AI confidently makes something up — like citing a library that doesn't exist or generating code that calls a function that was never written.
RAG (Retrieval-Augmented Generation)
RAG retrieves relevant documents at query time and adds them to the model's prompt — giving the AI access to your private documents without retraining it.
Tool Use
Tool use is when an AI can call external functions, APIs, or programs to do things it can't do alone.
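The host-side half of tool use is a dispatch loop: the model emits a structured call, and your code executes it and returns the result. A minimal sketch with hypothetical tool names:

```python
# Registry of tools the model is allowed to call (names are made up).
TOOLS = {
    "add": lambda a, b: a + b,
    "get_greeting": lambda name: f"Hello, {name}!",
}

def dispatch(call: dict):
    """Execute a model-emitted call like {"name": "add", "args": [2, 3]}."""
    fn = TOOLS[call["name"]]
    return fn(*call["args"])

print(dispatch({"name": "add", "args": [2, 3]}))            # → 5
print(dispatch({"name": "get_greeting", "args": ["Ada"]}))  # → Hello, Ada!
```

In a real system the model's output is parsed into that `call` dict, and the tool's return value is fed back into the conversation so the model can ground its next turn in the result.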