Glossary
Knowledge Distillation
AI DEFINITION

A process in which a smaller AI model (the student) is trained to replicate the predictions of a larger, more complex model (the teacher), reducing model size and compute requirements while preserving most of the teacher's performance. Rather than learning only from hard labels, the student is typically trained on the teacher's full output distribution (soft targets), which carries richer information about how classes relate to one another.
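As an illustration, the classic formulation (Hinton et al., 2015) trains the student to match the teacher's temperature-softened output distribution. The sketch below is a minimal, framework-free example; the function names, temperature value, and logits are illustrative, not from the original text.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: a higher T produces a softer
    # (higher-entropy) distribution over classes.
    z = logits / T
    e = np.exp(z - np.max(z))  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the teacher's softened distribution (soft
    # targets) and the student's, scaled by T^2 so gradient magnitudes
    # stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    return T**2 * np.sum(p * np.log(p / q))

# Hypothetical logits for a 3-class problem.
teacher = np.array([4.0, 1.0, 0.2])
student = np.array([3.5, 1.2, 0.3])
print(distillation_loss(student, teacher))  # small positive value
```

In practice this distillation term is combined with the ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.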