Context Window
ELI5 — The Vibe Check
A context window is how much text an AI can 'see' at once — its working memory. A small context window is like reading a book one page at a time and forgetting the previous pages. A large context window (200K+ tokens) means the AI can read your entire codebase at once. It's why Claude can understand your whole project while models with smaller context windows lose the plot after a few files.
Real Talk
The context window is the maximum number of tokens an LLM can process in a single interaction, covering both input (system prompt, conversation history, code) and output. Larger context windows (128K to 1M+ tokens) enable processing entire codebases, long documents, and extended conversations. Context window size directly determines how much code and history an AI coding assistant can reason over at once; exceed the limit and something has to be truncated, summarized, or chunked.
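Because the window covers both input and output, a prompt has to leave room for the model's reply. Here's a minimal budgeting sketch — the 4-characters-per-token ratio is a common rule of thumb for English text (not an exact tokenizer count), and the 200K limit and 4,096-token output reserve are illustrative values, not any specific model's spec:

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, context_limit: int = 200_000,
                    reserved_for_output: int = 4_096) -> bool:
    """Check whether a prompt leaves enough headroom for the model's response."""
    return estimate_tokens(prompt) + reserved_for_output <= context_limit
```

Real tools use the model's actual tokenizer for this check, but a character-based heuristic is often good enough to decide whether you're anywhere near the limit.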
When You'll Hear This
"Claude's 200K context window means it can see the entire monorepo." / "We're hitting the context window limit — need to chunk the documents."
Related Terms
LLM (Large Language Model)
An LLM is a humongous AI that read basically the entire internet and learned to predict what words come next, really really well.
Prompt Engineering
Prompt engineering is the art of talking to AI so it actually does what you want.
Token
In AI-land, a token is a chunk of text — roughly 3/4 of a word.
Tokenizer
A tokenizer chops text into pieces the AI model can process — often splitting words in ways humans wouldn't expect.