Context Stuffing
ELI5 — The Vibe Check
Context stuffing is cramming your entire codebase, README, package.json, and your cat's birthday into the AI's context window hoping it'll magically understand everything. More context = better results, right? Not always. Sometimes you overfeed the AI and it gets confused. It's like giving someone a 500-page briefing document for a 5-minute meeting.
Real Talk
Context stuffing refers to the practice of loading excessive amounts of code, documentation, or data into an LLM's context window in an attempt to improve output quality. While larger context windows make this possible, returns diminish quickly — models may lose focus on the actual task, hallucinate from irrelevant information, or suffer degraded recall for material buried in the middle of a long context.
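The alternative to stuffing is selection: rank candidate files by relevance to the task and only include what fits a token budget. Here's a minimal sketch of that idea — the function names, the naive keyword-overlap scoring, and the "~4 characters per token" heuristic are all illustrative assumptions, not any particular tool's API.

```python
# Hypothetical sketch: instead of stuffing every file into the prompt,
# rank files by keyword overlap with the task and keep only those that
# fit a token budget.

def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token (a token is roughly
    # 3/4 of an English word).
    return len(text) // 4

def select_context(task: str, files: dict[str, str], budget: int) -> list[str]:
    """Return file names worth including, most relevant first."""
    task_words = set(task.lower().split())

    def relevance(item: tuple[str, str]) -> int:
        # Naive scoring: how many task words appear in the file.
        _name, content = item
        return len(task_words & set(content.lower().split()))

    chosen, used = [], 0
    for name, content in sorted(files.items(), key=relevance, reverse=True):
        cost = estimate_tokens(content)
        if used + cost <= budget:
            chosen.append(name)
            used += cost
    return chosen

files = {
    "auth.py": "def login(user): check password token",
    "README.md": "project overview installation",
    "billing.py": "charge invoice stripe",
}
# With a tight budget, only the file mentioning "password" makes the cut.
print(select_context("fix the login password bug", files, budget=10))
```

In practice you'd swap the keyword overlap for embedding similarity (which is essentially what RAG does), but the shape of the fix is the same: relevance filter plus budget, not "send everything."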
When You'll Hear This
"I context-stuffed the entire monorepo and Claude still couldn't find the bug." / "Don't context-stuff — give it only what's relevant."
Related Terms
Context Window
A context window is how much text an AI can 'see' at once — its working memory.
Prompt Engineering
Prompt engineering is the art of talking to AI so it actually does what you want.
RAG (Retrieval Augmented Generation)
RAG is how you give an AI access to your private documents without retraining it.
Token
In AI-land, a token is a chunk of text — roughly 3/4 of a word.