Transfer Learning
Definition
A technique where a model trained on one task is reused or adapted for a different but related task, reducing the need for large amounts of task-specific data.
Transfer learning has become the dominant paradigm in modern AI. Instead of training a model from scratch for every task, a model is first pre-trained on a large general dataset and then fine-tuned on a smaller task-specific dataset. This approach dramatically reduces the data and compute required for new tasks. In NLP, models like BERT and GPT are pre-trained on billions of words and then fine-tuned for specific applications. In computer vision, models pre-trained on ImageNet are adapted for medical imaging or satellite analysis. Transfer learning democratized AI by making state-of-the-art performance accessible without massive training budgets.
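The pre-train-then-fine-tune recipe can be sketched end to end in a few lines. Below is a minimal NumPy illustration, not a production implementation: a small network is "pre-trained" on a synthetic general task, its feature extractor is then frozen, and only a new output head is trained on a smaller related task. The tasks, layer sizes, and learning rates are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(X, W):
    """Shared feature extractor: one tanh layer."""
    return np.tanh(X @ W)

# --- "Pre-training": learn feature extractor W on a large general task A ---
# Task A (synthetic stand-in): regress y = sum of the 8 inputs.
X_a = rng.normal(size=(500, 8))
y_a = X_a.sum(axis=1, keepdims=True)

W = rng.normal(scale=0.1, size=(8, 16))   # feature extractor (kept and frozen later)
v = rng.normal(scale=0.1, size=(16, 1))   # task-A head (discarded after pre-training)

for _ in range(2000):                     # plain gradient descent on MSE
    H = features(X_a, W)
    err = H @ v - y_a
    grad_v = H.T @ err / len(X_a)
    grad_W = X_a.T @ ((err @ v.T) * (1 - H**2)) / len(X_a)
    v -= 0.1 * grad_v
    W -= 0.1 * grad_W

# --- "Fine-tuning": freeze W, train only a new head on a small related task B ---
# Task B (related): regress y = mean of the inputs, with far less data.
X_b = rng.normal(size=(40, 8))
y_b = X_b.mean(axis=1, keepdims=True)

head = np.zeros((16, 1))
H_b = features(X_b, W)                    # W is frozen: never updated below
for _ in range(2000):
    err = H_b @ head - y_b
    head -= 0.1 * (H_b.T @ err / len(X_b))

mse = float(np.mean((H_b @ head - y_b) ** 2))
baseline = float(np.mean(y_b ** 2))       # error of the untrained (zero) head
print(mse < baseline)
```

Because the frozen features were learned on a related task, fitting just the small head on 40 examples is enough to beat the untrained baseline, which is the core economy of transfer learning. In a real system the same pattern appears at scale: a pre-trained backbone (e.g. a BERT or ImageNet model) is loaded, most of its weights are frozen or lightly updated, and a small task-specific head is trained on the new data.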
Related Terms
Few-Shot Learning
The ability of an AI model to learn and generalize from only a handful of examples, rather than requiring a large training dataset.
Fine-Tuning
The process of taking a pre-trained model and further training it on a smaller, task-specific dataset.
Pre-Training
The initial phase of training a foundation model on a large, general-purpose dataset before it is fine-tuned for specific tasks.
Foundation Model
A large AI model trained on broad data at scale that can be adapted to a wide range of downstream tasks.