Knowledge Distillation
Definition
A technique where a smaller 'student' model is trained to replicate the behavior of a larger 'teacher' model, enabling deployment on edge devices. Knowledge distillation transfers the learned representations of massive models into compact architectures that are faster and cheaper to run in production.
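The core mechanism can be sketched as a loss function: the student is trained to match the teacher's temperature-softened output distribution, typically via KL divergence. The function names, the temperature value, and the toy logits below are illustrative assumptions, not from the definition above; in practice this soft-target term is usually blended with an ordinary cross-entropy loss on the true labels.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the
    # teacher's relative confidence across non-top classes.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable as T grows.
    p = softmax(teacher_logits, temperature)  # teacher's "soft targets"
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()

# Toy batch of logits (hypothetical values): the teacher is confident,
# the student only roughly agrees, so the loss is positive.
teacher = np.array([[8.0, 2.0, 1.0], [1.0, 7.0, 0.5]])
student = np.array([[5.0, 3.0, 2.0], [2.0, 4.0, 1.0]])

loss = distillation_loss(student, teacher)
print(float(loss))
```

Minimizing this loss pulls the student's softened distribution toward the teacher's; when the two sets of logits coincide, the loss is zero.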