
Variance

Medium — good to know · AI & ML

ELI5 — The Vibe Check

Variance in ML means your model is too sensitive to the specific training data it saw. Change the training data a little and the model changes a lot — it learned all the noise. High variance usually means overfitting. The bias-variance tradeoff is the fundamental tension in ML: simple models have high bias, complex models have high variance.

Real Talk

Variance in the bias-variance tradeoff refers to the model's sensitivity to fluctuations in the training data. High variance indicates overfitting: the model captures noise rather than signal, and small changes in training data produce large changes in model behavior. Regularization, ensemble methods, and more data help reduce variance.

When You'll Hear This

"High variance means the model changes a lot across different training splits." / "We're in the high-variance regime — add dropout."
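The "add dropout" advice is one flavor of regularization. A hedged numpy sketch of the same idea using ridge regression (the collinear toy features and the penalty strength are my assumptions for illustration): penalizing large weights makes the learned coefficients far more stable across training splits.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_split(n=40):
    """One synthetic 'training split': y = 3x + noise, plus a nearly
    collinear second feature that makes plain least squares unstable."""
    x = rng.normal(size=n)
    X = np.column_stack([x, x + rng.normal(0, 0.01, size=n)])
    y = 3 * x + rng.normal(0, 1, size=n)
    return X, y

def fit(X, y, lam):
    # Ridge closed form: (X^T X + lam * I)^{-1} X^T y  (lam=0 is OLS).
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Refit on 200 fresh splits and compare coefficient spread.
ols_w = np.array([fit(*make_split(), lam=0.0) for _ in range(200)])
ridge_w = np.array([fit(*make_split(), lam=1.0) for _ in range(200)])

print("OLS   coefficient variance:", ols_w.var(axis=0))
print("ridge coefficient variance:", ridge_w.var(axis=0))
```

Unregularized least squares splits the weight arbitrarily between the two collinear features, so its coefficients swing wildly from split to split; the ridge penalty pins them down — exactly the "high-variance regime" fix the quote is pointing at.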

Made with passive-aggressive love by manoga.digital. Powered by Claude.