
Loss Function

Medium — good to know · AI & ML

ELI5 — The Vibe Check

The loss function is the AI's score of how badly it's doing. After every prediction, the loss function computes a number: 0 means perfect, high means terrible. The goal of training is to make this number as small as possible. Common choices include cross-entropy for classification and mean squared error for regression. The loss is what gradient descent is trying to minimize.
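The two losses named above can be computed by hand in a few lines. A minimal sketch (plain Python, no ML framework; the example numbers are made up for illustration):

```python
import math

def mse(y_true, y_pred):
    # Mean squared error: average of the squared gaps between
    # predictions and ground truth. 0 means every prediction is exact.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def cross_entropy(true_class, probs):
    # Cross-entropy for a single example: -log of the probability the
    # model assigned to the correct class. 0 when the model is
    # confidently right; grows without bound as it gets confidently wrong.
    return -math.log(probs[true_class])

print(mse([3.0, 5.0], [2.5, 5.5]))        # → 0.25 (predictions are close)
print(cross_entropy(1, [0.1, 0.8, 0.1]))  # → -log(0.8) ≈ 0.223
```

Training drives these numbers down: a model that assigns probability 0.99 to the right class scores ≈ 0.01, while one that assigns 0.1 scores ≈ 2.3.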

Real Talk

A loss function (or cost function) quantifies the discrepancy between model predictions and ground-truth labels. It must be differentiable (or at least admit subgradients, as MAE does at zero) with respect to model parameters to enable backpropagation. Common examples include cross-entropy loss (classification) and MSE/MAE (regression); in language modeling, perplexity — the exponential of the cross-entropy loss — is the standard reported metric. The choice of loss function directly shapes model behavior: MSE, for instance, penalizes outliers far more heavily than MAE.
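The link between differentiability and training can be shown end to end. A hedged sketch of gradient descent minimizing MSE for a one-parameter linear model y = w·x, with the gradient derived by hand rather than via autodiff (the data and learning rate are hypothetical choices, not from the text):

```python
# Toy data generated by the true relationship y = 2 * x.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

w = 0.0    # start far from the true parameter
lr = 0.05  # learning rate (step size)

for _ in range(200):
    # For L(w) = mean((w*x - y)^2), the derivative is
    # dL/dw = mean(2 * (w*x - y) * x) — this is what backprop
    # would compute automatically in a deep network.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step opposite the gradient to shrink the loss

print(round(w, 3))  # → 2.0: the loss is minimized at the true parameter
```

Because the loss is differentiable in w, each step has a well-defined downhill direction; that is exactly why non-differentiable objectives (e.g. raw accuracy) are replaced by smooth surrogates like cross-entropy during training.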

When You'll Hear This

"The training loss plateaued — try a different optimizer." / "Cross-entropy loss is standard for classification."

Made with passive-aggressive love by manoga.digital. Powered by Claude.