
SynFlow - Pruning Neural Networks Without Any Data by Iteratively Conserving Synaptic Flow

Offered By: Yannic Kilcher via YouTube

Tags

- Neural Networks Courses
- Deep Learning Courses
- Algorithm Design Courses

Course Description

Overview

Explore a 45-minute video lecture on SynFlow, a groundbreaking algorithm for pruning neural networks without using any data. Delve into the Lottery Ticket Hypothesis and understand why previous pruning-at-initialization approaches fail at high compression ratios. Learn about layer collapse, the conservation of synaptic saliency, and how iterative pruning avoids these issues. Discover the SynFlow algorithm, which achieves maximal compression capacity by conserving synaptic flow. Examine experimental results and gain insight into this data-agnostic approach, which challenges the notion that data is necessary to determine which synapses in a neural network are important.
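For a concrete picture of the data-free scoring the lecture covers, below is a minimal PyTorch-style sketch of the ideas summarized above: synaptic saliency is computed from a single all-ones input passed through the network with absolute-valued weights, and pruning proceeds iteratively over an exponential compression schedule to avoid layer collapse. The function names, the `compression`/`rounds` parameters, and the masking details are illustrative assumptions, not the paper's or the video's official code.

```python
import torch
import torch.nn as nn


def synflow_scores(model: nn.Module, input_shape):
    """Data-free synaptic saliency: score = |theta * dR/dtheta|, where R is the
    summed output of the network on an all-ones input, evaluated with the
    absolute values of the weights (so no training data is touched)."""
    # Temporarily replace each parameter by its absolute value, remembering signs.
    signs = [torch.sign(p.detach()) for p in model.parameters()]
    with torch.no_grad():
        for p in model.parameters():
            p.abs_()

    # One forward/backward pass on an all-ones "input".
    model.zero_grad()
    ones = torch.ones(1, *input_shape)
    model(ones).sum().backward()
    scores = [(p.grad * p).detach().abs() for p in model.parameters()]

    # Restore the original signs of the weights.
    with torch.no_grad():
        for p, s in zip(model.parameters(), signs):
            p.mul_(s)
    return scores


def iterative_synflow_prune(model: nn.Module, input_shape,
                            compression=10.0, rounds=100):
    """Prune toward the target compression ratio over many small rounds
    (exponential schedule); re-scoring after each round is what lets the
    method avoid layer collapse."""
    masks = [torch.ones_like(p) for p in model.parameters()]
    total = sum(m.numel() for m in masks)
    for k in range(1, rounds + 1):
        # Zero out already-pruned weights before re-scoring.
        with torch.no_grad():
            for p, m in zip(model.parameters(), masks):
                p.mul_(m)
        scores = synflow_scores(model, input_shape)
        keep = max(1, int(total * compression ** (-k / rounds)))
        flat = torch.cat([s.flatten() for s in scores])
        threshold = torch.topk(flat, keep).values.min()
        masks = [(s >= threshold).float() for s in scores]
    return masks


# Usage sketch (hypothetical model and settings):
# model = nn.Sequential(nn.Flatten(), nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
# masks = iterative_synflow_prune(model, input_shape=(1, 28, 28), compression=100.0)
```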

Syllabus

- Intro & Overview
- Pruning Neural Networks
- Lottery Ticket Hypothesis
- Paper Story Overview
- Layer Collapse
- Synaptic Saliency Conservation
- Connecting Layer Collapse & Saliency Conservation
- Iterative Pruning avoids Layer Collapse
- The SynFlow Algorithm
- Experiments
- Conclusion & Comments


Taught by

Yannic Kilcher

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems of Data Analysis)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX