Parameters
ELI5 — The Vibe Check
Parameters is the technical word for weights — the individual numbers inside an AI model. When someone says 'GPT-4 reportedly has about 1.7 trillion parameters' (an unconfirmed estimate — OpenAI has never published the figure), they mean roughly 1.7 trillion individual numbers that define how the model thinks. More parameters generally means more capable, but also more expensive to run. It's how we measure model size.
Real Talk
Parameters are the learnable values in a model — the individual scalar elements of weight matrices and bias vectors. Parameter count is a primary measure of model scale. LLMs are commonly described by their parameter count (e.g., 7B, 70B, 405B). Not all parameters are equally impactful; sparse architectures like MoE activate only a subset per forward pass.
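To make "scalar elements of weight matrices and bias vectors" concrete, here's a minimal sketch in plain NumPy. The layer sizes are made up for illustration — the point is only that every individual number in every weight matrix and bias vector counts as one parameter:

```python
import numpy as np

# A tiny 2-layer network: 4 inputs -> 8 hidden units -> 2 outputs.
# Each layer has a weight matrix and a bias vector; every scalar
# element inside them is one learnable parameter.
layers = [(4, 8), (8, 2)]

total = 0
for n_in, n_out in layers:
    W = np.zeros((n_in, n_out))  # weight matrix: n_in * n_out parameters
    b = np.zeros(n_out)          # bias vector: n_out parameters
    total += W.size + b.size

print(total)  # 4*8 + 8 + 8*2 + 2 = 58 parameters
```

The same counting logic, scaled up across attention blocks and feed-forward layers, is where headline figures like "7B" or "70B" come from.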
When You'll Hear This
"This is a 7B parameter model — fast and cheap." / "More parameters doesn't always mean better."
Related Terms
Hyperparameter
Hyperparameters are the settings you configure BEFORE training starts — as opposed to parameters (weights) which the model learns ON ITS OWN.
Model
A model is the trained AI — the finished product.
Training
Training is the long, expensive process where an AI learns from data.
Weights
Weights are the numbers inside a neural network that determine what it knows and how it behaves — they're the AI's 'brain cells.'