
Epoch

Definition

One complete pass through the entire training dataset during model training.

An epoch represents the model seeing every example in the training set exactly once. Training typically requires multiple epochs for the model to converge — simple models might need 10-100 epochs, while some deep learning tasks require hundreds. However, for large language model pre-training, models often train for less than one epoch on their massive datasets, as seeing each document once may be sufficient (and multiple epochs on internet data can cause memorization).

Monitoring performance across epochs helps detect overfitting (when validation loss starts increasing while training loss continues to decrease). Early stopping — halting training when validation performance stops improving — is a common regularization technique that depends on tracking epoch-level metrics.
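The epoch loop and early-stopping logic described above can be sketched as follows. This is a minimal illustration, not a production training loop: `train_step` and `validate` are hypothetical placeholders for one full pass over the training set and one evaluation on held-out data, and the toy loss values at the bottom exist only to exercise the bookkeeping.

```python
# Sketch of epoch-based training with early stopping.
# `train_step` and `validate` are hypothetical callables standing in
# for a real training pass and validation evaluation.

def train_with_early_stopping(train_step, validate, max_epochs=100, patience=5):
    """Run up to max_epochs full passes over the training data, stopping
    when validation loss fails to improve for `patience` epochs in a row."""
    best_val = float("inf")
    epochs_without_improvement = 0
    for epoch in range(1, max_epochs + 1):
        train_step()           # one epoch: every training example seen once
        val_loss = validate()  # epoch-level metric on held-out data
        if val_loss < best_val:
            best_val = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                return epoch, best_val  # early stop
    return max_epochs, best_val

# Toy run: validation loss improves for 10 epochs, then plateaus at 0.2,
# so training halts after `patience` stagnant epochs.
losses = iter([1 / (e + 1) for e in range(10)] + [0.2] * 90)
stopped_at, best = train_with_early_stopping(
    train_step=lambda: None,
    validate=lambda: next(losses),
)
```

In this toy run the loop stops at epoch 15: the best validation loss (0.1) arrives at epoch 10, and the next five epochs show no improvement, exhausting the patience budget long before `max_epochs`.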
