Pruning and Sparsity in Neural Networks - Lecture 4

Offered By: MIT HAN Lab via YouTube

Tags

TinyML Courses
Fine-Tuning Courses

Course Description

Overview

Dive deeper into neural network pruning and sparsity in this lecture from MIT's course on TinyML and Efficient Deep Learning Computing. Explore advanced pruning techniques, including how to select optimal pruning ratios for each layer and fine-tune sparse neural networks. Discover the lottery ticket hypothesis and learn about system support for sparsity. Gain valuable insights into making deep learning models more efficient and deployable on resource-constrained devices. Access accompanying slides and additional course materials to enhance your understanding of pruning, sensitivity scans, automatic pruning, and the AMC algorithm.
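To give a flavor of the per-layer pruning ratios and sensitivity scans mentioned above, below is a minimal sketch in PyTorch. It is not code from the lecture: the toy model, the random-data loader, and the evaluate helper are hypothetical stand-ins, and torch.nn.utils.prune.l1_unstructured is used here simply as a generic magnitude-pruning primitive.

    # Hedged sketch of a per-layer sensitivity scan (not the lecture's code).
    import copy
    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune

    def evaluate(model, loader):
        # Placeholder accuracy metric over a (toy) validation loader.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in loader:
                pred = model(x).argmax(dim=1)
                correct += (pred == y).sum().item()
                total += y.numel()
        return correct / total

    def sensitivity_scan(model, loader, ratios=(0.3, 0.5, 0.7, 0.9)):
        # Prune each layer alone at several ratios and record accuracy.
        # Layers whose accuracy drops fastest are "sensitive" and should
        # receive smaller pruning ratios.
        results = {}
        for name, module in model.named_modules():
            if isinstance(module, (nn.Linear, nn.Conv2d)):
                curve = []
                for r in ratios:
                    m = copy.deepcopy(model)  # prune a fresh copy each time
                    sub = dict(m.named_modules())[name]
                    prune.l1_unstructured(sub, name="weight", amount=r)
                    curve.append((r, evaluate(m, loader)))
                results[name] = curve
        return results

    # Toy usage: a small MLP and random data stand in for a real model/dataset.
    model = nn.Sequential(nn.Flatten(), nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10))
    loader = [(torch.randn(16, 64), torch.randint(0, 10, (16,)))]
    for layer, curve in sensitivity_scan(model, loader).items():
        print(layer, curve)

Reading off each layer's accuracy-versus-ratio curve and assigning lower ratios to the fast-degrading layers is the manual heuristic; the lecture goes on to automatic methods such as the AMC algorithm that choose the ratios without this hand tuning.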

Syllabus

Lecture 04 - Pruning and Sparsity (Part II) | MIT 6.S965


Taught by

MIT HAN Lab

Related Courses

TensorFlow: Working with NLP
LinkedIn Learning
Introduction to Video Editing - Video Editing Tutorials
Great Learning via YouTube
HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning
Python Engineer via YouTube
GPT3 and Finetuning the Core Objective Functions - A Deep Dive
David Shapiro ~ AI via YouTube
How to Build a Q&A AI in Python - Open-Domain Question-Answering
James Briggs via YouTube