Epoch
ELI5 — The Vibe Check
An epoch is one complete pass through your entire training dataset. If you have 100,000 examples, one epoch means the model has seen all 100,000 once. Models usually need many epochs to learn well, but too many epochs and they start memorizing instead of learning (overfitting). It's like rereading a textbook — helpful up to a point.
Real Talk
An epoch is one complete iteration over the entire training dataset. A training run typically consists of multiple epochs. Each epoch is divided into batches, with one weight update per batch, so a single epoch contains many update steps. Validation metrics are usually computed after each epoch to monitor overfitting. Too few epochs lead to underfitting; too many, to overfitting.
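The loop described above can be sketched in plain Python. This is a toy sketch, not any framework's API: a one-weight linear model fit with SGD on made-up data, where the learning rate, batch size, and epoch count are all illustrative hyperparameters.

```python
import random

# Toy setup: learn y = 2x with a single weight w. Everything here
# (data, lr, batch_size, epochs) is illustrative, not from a framework.
random.seed(0)
train = [(x, 2 * x) for x in (random.uniform(-1, 1) for _ in range(80))]
val = [(x, 2 * x) for x in (random.uniform(-1, 1) for _ in range(20))]

def mse(data, w):
    # mean squared error of the model y_hat = w * x on a dataset
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

w = 0.0          # the model's single learnable parameter
lr = 0.5         # learning rate (hyperparameter)
batch_size = 16  # examples per weight update (hyperparameter)
epochs = 10      # full passes over the training set (hyperparameter)

for epoch in range(1, epochs + 1):
    random.shuffle(train)                       # reshuffle every epoch
    for i in range(0, len(train), batch_size):  # batches within one epoch
        batch = train[i:i + batch_size]
        # gradient of MSE with respect to w, averaged over the batch
        grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
        w -= lr * grad                          # one update per batch
    # validation loss is checked once per epoch to watch for overfitting
    print(f"epoch {epoch}: val_loss={mse(val, w):.4f}")
```

After ten epochs the weight lands near the true value of 2, and the printed validation loss shrinks epoch over epoch — exactly the curve you watch when deciding how many epochs is enough.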
When You'll Hear This
"Train for 10 epochs and check validation loss." / "After epoch 5 the model started overfitting."
Related Terms
Batch
A batch is a small group of training examples that the model processes at once before updating its weights.
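The arithmetic connecting batches to epochs is worth seeing once. The numbers below are illustrative (the 100,000-example dataset from the ELI5 above, with an assumed batch size of 32):

```python
import math

# Illustrative numbers: one epoch = enough batches to cover the dataset.
dataset_size = 100_000
batch_size = 32
epochs = 10

# The last batch may be smaller than batch_size, hence ceil.
steps_per_epoch = math.ceil(dataset_size / batch_size)
total_steps = steps_per_epoch * epochs

print(steps_per_epoch)  # 3125 weight updates per epoch
print(total_steps)      # 31250 updates over the whole run
```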
Hyperparameter
Hyperparameters are the settings you configure BEFORE training starts — as opposed to parameters (weights) which the model learns ON ITS OWN.
Loss Function
The loss function is the AI's score of how badly it's doing.
Overfitting
Overfitting is when your model gets TOO good at the training data and becomes useless on new data.
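One common guard against overfitting is early stopping: quit training once validation loss stops improving. A minimal sketch, using hard-coded loss values that simulate a run where validation loss bottoms out at epoch 4 and then climbs:

```python
# Simulated per-epoch validation losses: improving, then getting worse.
val_losses = [0.90, 0.70, 0.55, 0.48, 0.52, 0.60, 0.71]

patience = 2          # tolerate this many epochs without improvement
best = float("inf")   # best validation loss seen so far
bad_epochs = 0
stopped_at = None

for epoch, loss in enumerate(val_losses, start=1):
    if loss < best:
        best = loss       # new best: in practice you'd checkpoint here
        bad_epochs = 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            stopped_at = epoch  # stop and keep the best checkpoint
            break

print(stopped_at)  # 6 — training halts, keeping the epoch-4 model
```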
Training
Training is the long, expensive process where an AI learns from data.