Knowledge Distillation in Efficient Machine Learning - Lecture 9

Offered By: MIT HAN Lab via YouTube

Tags

Machine Learning, Deep Learning, Neural Networks, Ensemble Learning, Transfer Learning, Model Optimization, Model Compression

Course Description

Overview

Explore knowledge distillation in this 58-minute lecture from MIT's EfficientML.ai course (6.5940, Fall 2024), presented by Prof. Song Han of the MIT HAN Lab. The lecture covers the principles and applications of knowledge distillation, showing how the technique transfers knowledge from a larger, more complex teacher model to a smaller, more efficient student model. Accompanying slides are available at efficientml.ai.
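For a concrete sense of the idea, a minimal sketch of the classic distillation loss (soft teacher targets combined with hard labels, following Hinton et al., 2015) is shown below. The temperature and weighting values are illustrative assumptions, not taken from the lecture.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend soft-target KL loss (teacher -> student) with hard-label cross-entropy.

    T (temperature) softens both distributions; alpha weights the two terms.
    Both values are assumptions chosen for illustration.
    """
    # Soft targets: match the student's distribution to the teacher's at temperature T
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitudes are comparable across temperatures

    # Hard targets: standard cross-entropy against the ground-truth labels
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1 - alpha) * hard_loss
```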

Syllabus

EfficientML.ai Lecture 9 - Knowledge Distillation (MIT 6.5940, Fall 2024)


Taught by

MIT HAN Lab

Related Courses

Structuring Machine Learning Projects
DeepLearning.AI via Coursera
Natural Language Processing on Google Cloud
Google Cloud via Coursera
Introduction to Learning Transfer and Life Long Learning (3L)
University of California, Irvine via Coursera
Advanced Deployment Scenarios with TensorFlow
DeepLearning.AI via Coursera
Neural Style Transfer with TensorFlow
Coursera Project Network via Coursera