Knowledge Distillation in Efficient Machine Learning - Lecture 9
Offered By: MIT HAN Lab via YouTube
Course Description
Overview
Explore knowledge distillation techniques in this 58-minute lecture from MIT's EfficientML.ai course (6.5940, Fall 2024). Delve into the principles and applications of knowledge distillation as presented by Prof. Song Han from the MIT HAN Lab. Gain insights into how this technique can be used to transfer knowledge from larger, more complex models to smaller, more efficient ones. Access accompanying slides at efficientml.ai to enhance your understanding of this crucial topic in machine learning optimization.
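To make the idea concrete, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) in PyTorch. It illustrates the general technique the lecture covers and is not code from the course itself; the function name distillation_loss and the hyperparameters T and alpha are illustrative choices for this example.

# Minimal sketch of soft-target knowledge distillation (Hinton et al., 2015).
# Illustrative only; not code from the lecture.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft loss: KL divergence between the temperature-softened
    # student and teacher distributions. Multiplying by T*T keeps
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard loss: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Blend the two; alpha controls how much the student imitates the teacher.
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random logits for a batch of 8 examples over 10 classes.
student = torch.randn(8, 10, requires_grad=True)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
loss.backward()

The temperature T softens the teacher's output distribution so the student can learn from the relative probabilities the teacher assigns to incorrect classes, which carry more information than the hard label alone.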
Syllabus
EfficientML.ai Lecture 9 - Knowledge Distillation (MIT 6.5940, Fall 2024)
Taught by
MIT HAN Lab
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
Good Brain, Bad Brain: Basics - University of Birmingham via FutureLearn
Statistical Learning with R - Stanford University via edX
Machine Learning 1—Supervised Learning - Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks - Harvard University via edX