YoVDO

Knowledge Distillation in Efficient Machine Learning - Lecture 9

Offered By: MIT HAN Lab via YouTube

Tags

Machine Learning Courses
Neural Networks Courses
Ensemble Learning Courses
Transfer Learning Courses
Model Compression Courses

Course Description

Overview

Explore knowledge distillation in this lecture from MIT's EfficientML.ai course (6.5940, Fall 2023). Delve into the principles and applications of knowledge distillation, a key method for model compression and efficiency in machine learning, in which a small "student" model is trained to match the softened output distribution of a larger "teacher" model. Professor Song Han presents the material in a Zoom-recorded lecture; accompanying slides are available at efficientml.ai.
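The core idea the lecture covers can be sketched in a few lines. The following is a minimal, generic illustration of the classic distillation loss (soft targets at temperature T, with the usual T² scaling), not code from the course slides; all function names here are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Divide logits by the temperature before normalizing;
    # a higher temperature yields a softer (more uniform) distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between the teacher's and student's softened outputs.

    The loss is scaled by T^2 so its gradient magnitude stays comparable
    across temperatures; in practice it is combined with a standard
    cross-entropy loss on the hard labels.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

When the student's logits match the teacher's exactly, the loss is zero; the temperature controls how much of the teacher's "dark knowledge" about non-target classes is exposed to the student.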

Syllabus

EfficientML.ai Lecture 9 - Knowledge Distillation (MIT 6.5940, Fall 2023, Zoom)


Taught by

MIT HAN Lab

Related Courses

Machine Learning 1—Supervised Learning
Brown University via Udacity
Data Mining: Theories and Algorithms for Tackling Big Data | 数据挖掘:理论与算法
Tsinghua University via edX
Big Data Applications: Machine Learning at Scale
Yandex via Coursera
Data Analytics Foundations for Accountancy II
University of Illinois at Urbana-Champaign via Coursera
PyCaret: Anatomy of Classification
Coursera Project Network via Coursera