Context Compaction
ELI5 — The Vibe Check
Context compaction is summarizing a long AI conversation down to just the important bits so the model can keep going without hitting context limits. Like making a CliffsNotes version of your chat history so the AI doesn't lose the plot.
Real Talk
Context compaction is the automated or manual summarization of prior conversation context to reduce token count while preserving essential state. Implementations range from naive (keep last N turns) to sophisticated (semantic summarization with preserved facts, decisions, and file references). Claude Code, Cursor, and most agent frameworks implement some form of compaction.
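The naive end of that spectrum is easy to sketch. Here is a minimal illustration of "keep last N turns" compaction with a placeholder summarizer standing in for an LLM call — all names (`Message`, `compact`, `summarize`, the 50k budget) are illustrative, not any particular framework's API:

```python
# Illustrative sketch of naive context compaction: once the history
# exceeds a token budget, older turns are collapsed into one summary
# message and only the last N turns are kept verbatim.

from dataclasses import dataclass

@dataclass
class Message:
    role: str      # "user" | "assistant" | "system"
    content: str

def estimate_tokens(messages):
    # Crude heuristic: roughly 4 characters per token.
    return sum(len(m.content) for m in messages) // 4

def summarize(messages):
    # Placeholder: a real system would call an LLM here, prompting it
    # to preserve facts, decisions, and file references.
    return f"[Summary of {len(messages)} earlier messages]"

def compact(messages, token_budget=50_000, keep_last=6):
    """Replace older turns with one summary message once over budget."""
    if estimate_tokens(messages) <= token_budget or len(messages) <= keep_last:
        return messages
    old, recent = messages[:-keep_last], messages[-keep_last:]
    summary = Message(role="system", content=summarize(old))
    return [summary] + recent
```

A more sophisticated implementation swaps `summarize` for a semantic summarization call and tunes `keep_last` and the budget to the model's context window.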
When You'll Hear This
"Hit 900k tokens — time for context compaction." / "Our agent loops automatically compact every 50k tokens."
Related Terms
Context Rot
Context rot is when an AI conversation has run so long that the model starts confusing old instructions with new ones and forgetting what it already did.
Context Window
A context window is how much text an AI can 'see' at once — its working memory.
Prompt Compression
Prompt compression is shrinking a prompt so it fits more context or costs less, without losing meaning.