
Underfitting

Definition

When a model is too simple to capture the underlying patterns in the data, resulting in poor performance on both training and test data.

Underfitting occurs when a model lacks the capacity, or has not been trained long enough, to learn the relationships present in the data. An underfit model shows high error on both the training and validation sets. Common causes include a model that is too small, insufficient training time, overly aggressive regularization, or poor feature selection; the corresponding remedies are a more expressive architecture, more training epochs, weaker regularization, or better feature engineering.

In modern deep learning, underfitting is less common than overfitting because models tend to be very large. It can still occur, however, when fine-tuning on insufficient data or when the learning rate is set inappropriately. Balancing underfitting against overfitting is a core challenge in model development.
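A minimal sketch of the "model too simple" case: fitting a straight line to data generated by a quadratic function. The data, model degrees, and error metric here are illustrative choices, not from any particular library workflow. The linear model's mean squared error stays high on both the training and held-out splits, which is the signature of underfitting; a model with enough capacity (degree 2) drives both errors down.

```python
import numpy as np

rng = np.random.default_rng(0)

# Quadratic ground truth with mild noise.
x = rng.uniform(-3, 3, size=200)
y = x**2 + rng.normal(scale=0.3, size=x.size)

# Simple train/test split.
x_train, y_train = x[:150], y[:150]
x_test, y_test = x[150:], y[150:]

def mse(coeffs, xs, ys):
    """Mean squared error of a polynomial model on (xs, ys)."""
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

# Degree-1 model underfits: it cannot represent the curvature,
# so its error is high on BOTH splits.
linear = np.polyfit(x_train, y_train, deg=1)

# Degree-2 model has enough capacity to fit the pattern.
quadratic = np.polyfit(x_train, y_train, deg=2)

print("linear   :", mse(linear, x_train, y_train), mse(linear, x_test, y_test))
print("quadratic:", mse(quadratic, x_train, y_train), mse(quadratic, x_test, y_test))
```

Note that the underfit model's training error is roughly as bad as its test error; by contrast, an overfit model would show low training error but high test error.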
