
Transfer Learning

Definition

A technique where a model trained on one task is reused or adapted for a different but related task, reducing the need for large amounts of task-specific data.

Transfer learning has become the dominant paradigm in modern AI. Instead of training a model from scratch for every task, a model is first pre-trained on a large general dataset and then fine-tuned on a smaller task-specific dataset. This approach dramatically reduces the data and compute required for new tasks. In NLP, models like BERT and GPT are pre-trained on billions of words and then fine-tuned for specific applications. In computer vision, models pre-trained on ImageNet are adapted for medical imaging or satellite analysis. Transfer learning has democratized AI by making state-of-the-art performance accessible without massive training budgets.
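The pre-train-then-fine-tune workflow above can be sketched in miniature. This is a toy illustration, not a production recipe: the "pre-trained backbone" is stood in for by a fixed random projection (in practice it would be a network like BERT or a ResNet), and the dataset is synthetic. The key idea it demonstrates is that the backbone's weights stay frozen while only a small task-specific head is trained on the new data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained feature extractor: a fixed projection whose
# weights are frozen, i.e. never updated during fine-tuning. In real
# transfer learning this would be a large pre-trained network.
W_frozen = rng.normal(size=(2, 16))

def extract_features(x):
    # Frozen backbone: reusable representations learned "elsewhere".
    return np.tanh(x @ W_frozen)

# Small task-specific dataset (two Gaussian blobs, 50 points per class),
# standing in for the limited labeled data of the target task.
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Fine-tuning: only the new task head (a logistic-regression layer)
# is trained; the backbone weights W_frozen are left untouched.
w, b = np.zeros(16), 0.0
feats = extract_features(X)
for _ in range(500):
    p = 1 / (1 + np.exp(-(feats @ w + b)))   # sigmoid predictions
    grad = p - y                             # gradient of log loss w.r.t. logits
    w -= 0.1 * feats.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = ((1 / (1 + np.exp(-(feats @ w + b))) > 0.5) == y).mean()
```

Because the backbone is reused rather than retrained, only the 17 head parameters are fit to the new task, which is why transfer learning needs far less task-specific data than training from scratch.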
