
Parameters

Medium — good to know · AI & ML

ELI5 — The Vibe Check

'Parameters' is the technical word for weights — the individual numbers inside an AI model. When someone says 'GPT-4 has 1.7 trillion parameters,' they mean 1.7 trillion numbers that together define how the model responds. More parameters generally means more capable, but also more expensive to train and run. It's the standard way we measure model size.

Real Talk

Parameters are the learnable values in a model — the individual scalar elements of its weight matrices and bias vectors. Parameter count is the primary measure of model scale, and LLMs are commonly described by it (e.g., 7B, 70B, 405B). Not all parameters are equally impactful, though: sparse architectures such as Mixture-of-Experts (MoE) activate only a subset of parameters per forward pass, so a model's total count can far exceed its active count.
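Where do those headline numbers come from? They're just the sum of every weight-matrix and bias-vector element across all layers. A minimal sketch, using a made-up 3-layer MLP with hypothetical layer sizes (real LLMs also count embedding tables, attention projections, and norm parameters):

```python
# Counting parameters in a toy MLP.
# A dense (fully connected) layer with n_in inputs and n_out outputs
# has a weight matrix of n_in * n_out values plus a bias of n_out values.

def dense_params(n_in: int, n_out: int) -> int:
    return n_in * n_out + n_out

# Hypothetical architecture: 512 -> 1024 -> 1024 -> 256
layers = [(512, 1024), (1024, 1024), (1024, 256)]
total = sum(dense_params(n_in, n_out) for n_in, n_out in layers)
print(f"{total:,} parameters")  # ~1.8M — tiny by LLM standards
```

Scale the same bookkeeping up across dozens of transformer blocks and you get the billions quoted in model names like "7B".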

When You'll Hear This

"This is a 7B parameter model — fast and cheap." / "More parameters doesn't always mean better."

Made with passive-aggressive love by manoga.digital. Powered by Claude.