Knowledge Distillation

A model compression technique in which a smaller student model is trained to reproduce the behavior and outputs of a larger, more capable teacher model.

Related concepts: AI Inference, Model Quantization, Model Pruning, Deep Learning, Neural Networks, Fine-Tuning, Edge AI
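As a minimal sketch of the idea, the student can be trained to match the teacher's temperature-softened output distribution; the loss below follows the common soft-target formulation (KL divergence between teacher and student softmax outputs, scaled by T²). All function names here are illustrative, not from any particular library.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from student to teacher soft targets,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# A student that exactly matches the teacher incurs zero loss.
teacher = [2.0, 1.0, 0.1]
print(distillation_loss(teacher, teacher))  # -> 0.0
```

In practice this soft-target loss is usually combined with the ordinary cross-entropy loss on ground-truth labels, weighted by a mixing coefficient.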