YoVDO

Pruning and Sparsity in Machine Learning - Part I

Offered By: MIT HAN Lab via YouTube

Tags

Machine Learning Courses
Model Compression Courses

Course Description

Overview

Explore the fundamentals of pruning and sparsity in machine learning through this comprehensive lecture from MIT's 6.5940 course. Delve into the first part of a two-part series on pruning techniques and sparse neural networks, guided by Professor Song Han. Learn about the importance of model compression, various pruning methods, and their impact on neural network efficiency. Gain insights into state-of-the-art approaches for reducing model size while maintaining performance. Access accompanying slides from the EfficientML.ai website to enhance your understanding of these critical concepts in efficient machine learning.
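Among the pruning methods the lecture surveys, magnitude-based pruning is the most common starting point: weights with the smallest absolute values are zeroed out to reach a target sparsity. A minimal sketch, assuming NumPy and a simple global-threshold scheme (function name and threshold strategy are illustrative, not from the lecture):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude.

    A simplified illustration of magnitude-based pruning; real frameworks
    typically prune per-layer or iteratively with fine-tuning in between.
    """
    flat = np.abs(weights).flatten()
    k = int(len(flat) * sparsity)
    if k == 0:
        return weights.copy()
    # The k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.array([[0.1, -0.5, 0.05],
              [0.9, -0.02, 0.3]])
pruned = magnitude_prune(w, 0.5)  # half the weights are set to zero
```

Here the three smallest-magnitude entries (0.1, 0.05, -0.02) are removed while the larger weights survive, trading a small accuracy cost for a smaller, sparser model.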

Syllabus

EfficientML.ai Lecture 3 - Pruning and Sparsity Part I (MIT 6.5940, Fall 2024)


Taught by

MIT HAN Lab

Related Courses

TensorFlow Lite for Edge Devices - Tutorial
freeCodeCamp
Few-Shot Learning in Production
HuggingFace via YouTube
TinyML Talks Germany - Neural Network Framework Using Emerging Technologies for Screening Diabetic
tinyML via YouTube
TinyML for All: Full-stack Optimization for Diverse Edge AI Platforms
tinyML via YouTube
TinyML Talks - Software-Hardware Co-design for Tiny AI Systems
tinyML via YouTube