Few-Shot Learning
Definition
The ability of an AI model to learn and generalize from only a handful of examples, rather than requiring thousands or millions of training samples.
Few-shot learning enables models to adapt to new tasks with very limited data. In the context of large language models, few-shot learning refers to providing a small number of examples directly in the prompt to guide the model's behavior (also called in-context learning). For example, showing a model three labeled examples of sentiment classification allows it to classify new text without any weight updates. This capability emerged as models scaled up in size and training data. Few-shot learning is closely related to meta-learning, in which models learn how to learn efficiently. It is especially valuable in domains where labeled data is scarce or expensive to obtain.
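The sentiment-classification example above can be sketched as a prompt-building step. The function below is a minimal, hypothetical illustration: it assembles a few labeled examples and a new query into a single prompt string that would then be sent to a language model API (the API call itself is omitted); the example texts and the function name are assumptions, not part of any specific library.

```python
# Minimal sketch of few-shot (in-context) prompting for sentiment
# classification. The labeled examples steer the model's behavior
# without any weight updates; in practice the assembled prompt would
# be sent to an LLM API.

EXAMPLES = [
    ("The movie was a delight from start to finish.", "positive"),
    ("I regret buying this product.", "negative"),
    ("The staff went out of their way to help us.", "positive"),
]

def build_few_shot_prompt(examples, query):
    """Assemble an instruction, labeled examples, and the new query."""
    lines = ["Classify the sentiment of each text as positive or negative.", ""]
    for text, label in examples:
        lines.append(f"Text: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    # The model is expected to complete the final "Sentiment:" line.
    lines.append(f"Text: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(EXAMPLES, "An absolutely wonderful experience.")
print(prompt)
```

With zero examples the same template becomes a zero-shot prompt; adding more examples generally improves reliability at the cost of a longer context.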
Related Terms
Large Language Model
A neural network with billions of parameters trained on massive text datasets, capable of understand...
Prompt Engineering
The practice of crafting effective instructions and inputs for AI models to elicit desired outputs, ...
Transfer Learning
A technique where a model trained on one task is reused or adapted for a different but related task,...
Zero-Shot Learning
The ability of a model to perform a task it has never been explicitly trained on, without any task-s...