In-Context Learning
ELI5 — The Vibe Check
In-context learning is the AI's ability to learn new tricks just from what you put in the prompt — without changing any of its actual brain weights. It's like a student who gets better at a task just by reading the instructions more carefully. You stuff examples, rules, or context into the prompt and the model adapts its behavior accordingly. The weights don't change, but the output does.
Real Talk
In-context learning (ICL) is the phenomenon where large language models adapt their behavior based on information provided in the prompt at inference time. Unlike fine-tuning, ICL requires no gradient updates — the model extracts patterns from the prompt context. It encompasses both few-shot examples and instruction following, and is a key emergent capability of large-scale transformers.
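As a concrete sketch of what "extracting patterns from the prompt" means, here is a minimal few-shot prompt builder. The task, example reviews, and labels are all illustrative, and no real model API is called; the point is that the model's behavior is steered entirely by what gets packed into this string, with no weight updates:

```python
# Minimal sketch of in-context learning via a few-shot prompt.
# The model's weights never change; its behavior is steered entirely
# by the examples placed in the prompt. All examples are illustrative.

def build_icl_prompt(examples, query):
    """Format (input, label) pairs plus a new query as one prompt string."""
    lines = ["Classify the sentiment as positive or negative.", ""]
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model would complete this line
    return "\n".join(lines)

examples = [
    ("Loved every minute of it.", "positive"),
    ("A total waste of two hours.", "negative"),
]
prompt = build_icl_prompt(examples, "Surprisingly heartfelt and funny.")
print(prompt)
```

Sending a string like this to any LLM is the whole trick: the examples demonstrate both the task and the output format, and the model picks up the pattern at inference time.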
When You'll Hear This
"We used in-context learning instead of fine-tuning — way faster to iterate." / "The model picks up the output format through in-context learning."
Related Terms
Context Window
A context window is how much text an AI can 'see' at once — its working memory.
Few-Shot Learning
Few-shot learning is teaching an AI by showing it just a few examples — like showing someone two pictures of a platypus and saying 'find more of these.'
LLM (Large Language Model)
An LLM is a humongous AI that read basically the entire internet and learned to predict what words come next, really really well.
Prompt Engineering
Prompt engineering is the art of talking to AI so it actually does what you want.