Underfitting
Definition
When a model is too simple to capture the underlying patterns in the data, resulting in poor performance on both training and test data.
Underfitting occurs when a model lacks the capacity, or has not been trained long enough, to learn the relationships present in the data. An underfit model produces high error on both the training and validation sets. Common causes include a model that is too small, insufficient training time, overly aggressive regularization, or poor feature selection; corresponding fixes include using a more complex architecture, training for more epochs, reducing regularization, or engineering better features. In modern deep learning, underfitting is less common than overfitting because models tend to be very large. However, it can still occur when fine-tuning on insufficient data or when the learning rate is set inappropriately. Balancing underfitting against overfitting is a core challenge in model development.
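The high-error-on-both-sets signature can be sketched with a small experiment: fitting a straight line to data generated from a quadratic relationship. The data, split sizes, and noise level below are illustrative assumptions, not from any particular dataset; only NumPy is used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic quadratic ground truth with noise; a straight line cannot capture it.
x = rng.uniform(-3, 3, size=200)
y = x**2 + rng.normal(scale=0.5, size=x.shape)

x_train, y_train = x[:150], y[:150]
x_test, y_test = x[150:], y[150:]

def fit_poly(degree):
    """Least-squares polynomial fit; returns (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    mse = lambda xs, ys: np.mean((np.polyval(coeffs, xs) - ys) ** 2)
    return mse(x_train, y_train), mse(x_test, y_test)

train_lin, test_lin = fit_poly(1)    # degree 1: too simple -> underfits
train_quad, test_quad = fit_poly(2)  # degree 2: matches the true relationship

print(f"degree 1: train MSE {train_lin:.2f}, test MSE {test_lin:.2f}")
print(f"degree 2: train MSE {train_quad:.2f}, test MSE {test_quad:.2f}")
```

The linear model shows high error on the training set and the test set alike, the defining symptom of underfitting; increasing capacity (degree 2 here) drives both errors down together.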
Related Terms
Learning Rate
A hyperparameter that controls how much model weights are adjusted in response to the estimated error each time the weights are updated.
Regularization
A set of techniques used to prevent overfitting by adding constraints or penalties to the model during training.
Overfitting
When a model learns the training data too well, including its noise and peculiarities, and fails to generalize to new, unseen data.
Hyperparameter
A configuration value set before training begins that controls the training process itself, as opposed to model parameters, which are learned from the data.