
Underfitting

Medium — good to know · AI & ML

ELI5 — The Vibe Check

Underfitting is the opposite of overfitting — the model hasn't learned enough and does badly on BOTH the training data AND new data. It's like a student who barely studied and fails every test. The model is too simple, wasn't trained long enough, or the learning rate is off. The fix is usually to train longer or use a bigger model.

Real Talk

Underfitting occurs when a model fails to capture the underlying patterns in the training data, resulting in high bias and poor performance on both the training and validation sets. Common causes include insufficient model capacity, too few training steps, a learning rate that is too high (or too low) for training to converge, or poorly chosen features. It is the opposite failure mode to overfitting: where overfitting memorizes noise, underfitting misses the signal entirely.
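The telltale sign — high error on the training data itself — can be shown in a few lines. A minimal sketch (the data and model choices here are illustrative assumptions, not from the text): fit a straight line to data that is actually quadratic. The low-capacity model underfits, so even its training error stays high; a model with enough capacity drives it to near zero.

```python
import numpy as np

# Illustrative data: the true relationship is quadratic.
x = np.linspace(-3, 3, 50)
y = x ** 2

def train_mse(degree: int) -> float:
    # Fit a polynomial of the given degree, then measure error on the
    # SAME training data -- underfitting shows up as high TRAINING error.
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    return float(np.mean((pred - y) ** 2))

mse_linear = train_mse(1)     # too little capacity for a quadratic
mse_quadratic = train_mse(2)  # enough capacity to fit the pattern

print(f"degree 1 training MSE: {mse_linear:.3f}")    # large: underfitting
print(f"degree 2 training MSE: {mse_quadratic:.3e}") # near zero
```

The degree-1 model can't do better than a flat line through the mean of y here, so its training MSE stays far above zero no matter how long you "train" — the classic underfitting signature that distinguishes it from overfitting, where training error is low but validation error is high.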

When You'll Hear This

"The training loss is still high — we're underfitting." / "Use a larger model to fix underfitting."

Made with passive-aggressive love by manoga.digital. Powered by Claude.