Underfitting
ELI5 — The Vibe Check
Underfitting is the opposite of overfitting — the model hasn't learned enough and does badly on BOTH the training data AND new data. It's like a student who barely studied and fails every test. Your model is too simple, it didn't train long enough, or the learning rate is off. The fix is usually to train longer or use a bigger model.
Real Talk
Underfitting occurs when a model fails to capture the underlying patterns in the training data, resulting in high bias and poor performance on both the training and validation sets. Common causes include insufficient model capacity, too few training steps, a poorly tuned learning rate, or uninformative features. It is the opposite failure mode to overfitting: the telltale sign is that training error itself stays high, not just validation error.
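A minimal sketch of the idea, using made-up toy data: fit a straight line (too little capacity) and a cubic (enough capacity) to data with a cubic pattern, and compare error on the training data itself. The underfit model is bad even on the data it was trained on — that's the signature of underfitting.

```python
import numpy as np

# Hypothetical toy data: a cubic pattern plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = x**3 + rng.normal(0, 1, size=x.shape)

# Degree-1 model: too simple for a cubic pattern -> underfits.
underfit_pred = np.polyval(np.polyfit(x, y, deg=1), x)
# Degree-3 model: enough capacity to capture the pattern.
good_pred = np.polyval(np.polyfit(x, y, deg=3), x)

def mse(pred):
    """Mean squared error on the TRAINING data."""
    return float(np.mean((y - pred) ** 2))

print(f"train MSE, degree 1: {mse(underfit_pred):.2f}")  # high: underfitting
print(f"train MSE, degree 3: {mse(good_pred):.2f}")      # low: pattern captured
```

Note the diagnosis needs no held-out data at all: high training error alone tells you the model is underfitting, which is why "the training loss is still high" is the usual way people spot it.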
When You'll Hear This
"The training loss is still high — we're underfitting." / "Use a larger model to fix underfitting."
Related Terms
Bias
In ML, bias means the model has systematic errors — it's consistently wrong in the same direction.
Loss Function
The loss function is the AI's score of how badly it's doing.
Overfitting
Overfitting is when your model gets TOO good at the training data and becomes useless on new data.
Training
Training is the long, expensive process where an AI learns from data.
Variance
Variance in ML means your model is too sensitive to the specific training data it saw.