
Loss Function

Definition

A mathematical function that measures how far a model's predictions are from the actual target values, providing the signal that drives the training process.

The loss function (also called the cost function or objective function) quantifies the model's error and is the quantity that training seeks to minimize. Common loss functions include cross-entropy for classification tasks, mean squared error for regression, and various specialized losses for generative models. For language models, the standard loss is cross-entropy between the predicted next-token distribution and the actual next token. The choice of loss function significantly shapes what the model learns — for example, using a perceptual loss instead of a pixel-wise loss for image generation tends to produce more visually pleasing results. Loss values are tracked during training to monitor convergence and to detect issues such as overfitting or training instability.
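The two most common losses mentioned above can be sketched in a few lines of plain Python. This is a minimal illustration, not a production implementation (the function names are our own, and real frameworks compute these over batches of logits with numerical-stability tricks):

```python
import math

def cross_entropy(probs, target_index):
    """Cross-entropy for one prediction: -log of the probability
    the model assigned to the correct class (or next token)."""
    return -math.log(probs[target_index])

def mean_squared_error(predictions, targets):
    """MSE for regression: average squared difference
    between predicted and actual values."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

# A model that puts 70% probability on the correct next token
# incurs a much lower loss than one that puts only 10% on it.
confident = cross_entropy([0.1, 0.7, 0.2], target_index=1)   # -log(0.7) ≈ 0.357
uncertain = cross_entropy([0.6, 0.1, 0.3], target_index=1)   # -log(0.1) ≈ 2.303

regression_loss = mean_squared_error([2.5, 0.0], [3.0, -0.5])  # ≈ 0.25
```

Note how cross-entropy punishes confident wrong answers harshly: as the probability assigned to the correct token approaches zero, the loss grows without bound, which is exactly the gradient signal that pushes a language model toward the true next-token distribution.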
