Weights

Medium — good to know · AI & ML

ELI5 — The Vibe Check

Weights are the numbers inside a neural network that determine what it knows and how it behaves — think of them as the strengths of the connections between the AI's 'brain cells.' Training a model means adjusting billions of these weights until it gets good at its task. When someone shares 'open weights,' they're sharing the model's actual learned knowledge, not just its architecture.

Real Talk

Weights (also called parameters) are the learnable numerical values in a neural network that transform inputs into outputs. During training, they are adjusted via backpropagation and gradient descent to minimize a loss function. Parameter count is a rough indicator of model capacity — GPT-4 is widely estimated at around 1.8T parameters, though OpenAI has never confirmed a figure. 'Open weights' means the trained parameters are publicly released, not just the model architecture.
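A toy sketch of the training loop described above — one weight, one squared-error loss, plain gradient descent (illustrative only; the data, learning rate, and step count here are made up, not from any real model):

```python
# Learn a single weight w so that w * x approximates y = 2x.
def train():
    w = 0.0          # the one "weight" being learned
    lr = 0.1         # learning rate (step size)
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
    for _ in range(100):
        for x, y in data:
            pred = w * x
            grad = 2 * (pred - y) * x   # d/dw of the squared error (pred - y)^2
            w -= lr * grad              # gradient descent: nudge w downhill on the loss
    return w

print(train())  # converges to ~2.0
```

A real network does exactly this, just with billions of weights at once and gradients computed by backpropagation through many layers.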

When You'll Hear This

"The model has 70 billion weights — needs 140GB in FP16." / "Download the weights from Hugging Face and run locally."

Made with passive-aggressive love by manoga.digital. Powered by Claude.