Batch

Medium — good to know · AI & ML

ELI5 — The Vibe Check

A batch is a small group of training examples that the model processes at once before updating its weights. Instead of learning from examples one at a time (slow) or all at once (doesn't fit in memory), you feed it in batches. Batch size is a hyperparameter — bigger batches are faster but sometimes learn worse. AI loves batches.

Real Talk

A batch is a subset of training examples processed together in a single forward and backward pass, followed by one weight update. Batch size is a key hyperparameter: larger batches give more stable gradient estimates and better hardware utilization, but they need more memory and can hurt generalization; smaller batches introduce gradient noise that can act as a regularizer. Mini-batch gradient descent — updating on batches rather than single examples or the full dataset — is the standard training approach.
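The loop above can be sketched in a few lines. This is a minimal illustration of mini-batch gradient descent on a toy linear-regression problem, not code from the article — all names (`batch_size`, `lr`, and so on) are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))          # 256 training examples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=256)

w = np.zeros(3)
batch_size = 32                        # the hyperparameter in question
lr = 0.1

for epoch in range(50):
    perm = rng.permutation(len(X))     # shuffle before slicing into batches
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]        # one batch: forward + backward + update
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad                 # weights change once per batch, not per example
```

The key point is that the gradient is averaged over the 32 examples in each batch, so the model takes one update step per batch rather than one per example (too noisy and slow) or one per full dataset pass (too memory-hungry).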

When You'll Hear This

"Set batch size to 32." / "Out of memory — reduce the batch size."

Made with passive-aggressive love by manoga.digital. Powered by Claude.