In-Context Learning

Medium — good to know · AI & ML

ELI5 — The Vibe Check

In-context learning is the AI's ability to learn new tricks just from what you put in the prompt — without changing any of its actual brain weights. It's like a student who gets better at a task just by reading the instructions more carefully. You stuff examples, rules, or context into the prompt and the model adapts its behavior accordingly. The weights don't change, but the output does.

Real Talk

In-context learning (ICL) is the phenomenon where large language models adapt their behavior based on information provided in the prompt at inference time. Unlike fine-tuning, ICL requires no gradient updates — the model extracts patterns from the prompt context. It encompasses both few-shot examples and instruction following, and is a key emergent capability of large-scale transformers.
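The few-shot side of ICL is easiest to see in the prompt itself: you concatenate labeled examples with a new query and let the model infer the pattern. A minimal sketch (the task, function name, and examples are illustrative, not from any specific API):

```python
# Few-shot in-context learning: all the "learning" lives in the prompt.
# The model's weights never change; it just conditions on these examples.

def build_few_shot_prompt(examples, query):
    """Pack labeled examples plus a new query into a single prompt string."""
    lines = ["Classify the sentiment as positive or negative.", ""]
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model completes from here
    return "\n".join(lines)

examples = [
    ("Absolutely loved it.", "positive"),
    ("A waste of two hours.", "negative"),
]
prompt = build_few_shot_prompt(examples, "Surprisingly delightful.")
print(prompt)
```

Swapping the examples changes the behavior on the next call with zero retraining, which is exactly why ICL is so much faster to iterate on than fine-tuning.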

When You'll Hear This

"We used in-context learning instead of fine-tuning — way faster to iterate." / "The model picks up the output format through in-context learning."

Made with passive-aggressive love by manoga.digital. Powered by Claude.