Variance
ELI5 — The Vibe Check
Variance in ML means your model is too sensitive to the specific training data it saw. Change the training data a little and the model changes a lot — it learned all the noise. High variance usually means overfitting. The bias-variance tradeoff is the fundamental tension in ML: simple models have high bias, complex models have high variance.
Real Talk
Variance in the bias-variance tradeoff refers to the model's sensitivity to fluctuations in the training data. High variance indicates overfitting: the model captures noise rather than signal, and small changes in the training data produce large changes in model behavior. Formally, expected test error decomposes into bias squared, variance, and irreducible noise, which is why reducing one often inflates the other. Regularization, ensemble methods, and more training data all help reduce variance.
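The "sensitivity to fluctuations" definition is directly measurable: refit a model on many resampled training sets and watch how much its prediction at one fixed point jitters. A minimal sketch below, using polynomial regression on noisy sine data as a stand-in for "simple vs. complex model" (all names and numbers are illustrative, not from any particular library's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=30):
    # noisy samples of a sine wave: signal + noise
    x = rng.uniform(0, 1, n)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)
    return x, y

x_query = 0.5                 # fixed input where we probe predictions
preds = {1: [], 15: []}       # polynomial degree -> predictions per resample

for _ in range(200):          # 200 fresh training sets
    x, y = make_data()
    for degree in preds:
        coefs = np.polyfit(x, y, degree)              # least-squares fit
        preds[degree].append(np.polyval(coefs, x_query))

for degree, p in preds.items():
    # spread of predictions across training sets = the model's variance
    print(f"degree {degree}: prediction std across resamples = {np.std(p):.3f}")
```

The degree-15 model's predictions swing far more across resamples than the degree-1 model's, which is exactly the "high-variance regime" the quotes below describe.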
When You'll Hear This
"High variance means the model changes a lot across different training splits." / "We're in the high-variance regime — add dropout."
Related Terms
Bias
In ML, bias means the model makes systematic errors — it's consistently wrong in the same direction, usually because it's too simple to capture the real pattern.
Overfitting
Overfitting is when your model gets TOO good at the training data and becomes useless on new data.
Underfitting
Underfitting is the opposite of overfitting — the model hasn't learned enough and does badly on BOTH the training data AND new data.
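The overfitting/underfitting contrast above shows up cleanly when you compare training error and test error across model complexity. A hedged sketch (same illustrative polynomial-regression setup; degrees 0, 3, and 15 stand in for underfit, reasonable, and overfit):

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_sine(n):
    # noisy samples of a sine wave
    x = rng.uniform(0, 1, n)
    return x, np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

x_train, y_train = noisy_sine(30)
x_test, y_test = noisy_sine(200)

def mse(coefs, x, y):
    return float(np.mean((np.polyval(coefs, x) - y) ** 2))

errors = {}
for degree in (0, 3, 15):     # underfit / reasonable / overfit
    coefs = np.polyfit(x_train, y_train, degree)
    errors[degree] = (mse(coefs, x_train, y_train), mse(coefs, x_test, y_test))
    train_err, test_err = errors[degree]
    print(f"degree {degree:>2}: train MSE = {train_err:.3f}, test MSE = {test_err:.3f}")
```

The degree-0 model is bad on both sets (underfitting), while the degree-15 model posts the lowest training error but a much worse test error (overfitting) — the gap between the two columns is the tell.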