SynFlow - Pruning Neural Networks Without Any Data by Iteratively Conserving Synaptic Flow
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a 45-minute video lecture on SynFlow, a groundbreaking algorithm for pruning neural networks without using any data. Delve into the concept of the Lottery Ticket Hypothesis and understand why previous pruning attempts have failed. Learn about layer collapse, synaptic saliency conservation, and how iterative pruning can avoid these issues. Discover the SynFlow algorithm, which achieves maximum compression capacity by conserving synaptic flow. Examine experimental results and gain insights into this data-agnostic approach that challenges the notion that data is necessary to determine important synapses in neural networks.
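The lecture's core idea, scoring weights by synaptic flow through an all-ones input and pruning iteratively so no layer collapses, can be sketched in a few lines. The following is a minimal NumPy illustration for a two-layer linear network, where the objective R = 1ᵀ|W₂||W₁|1 has a closed-form gradient; the function names and the exponential pruning schedule's round count are illustrative choices, not the paper's exact implementation.

```python
import numpy as np

# Sketch of the SynFlow saliency score for a 2-layer linear network.
# The data-free objective is R = 1^T |W2| |W1| 1 (all-ones input, weights
# replaced by absolute values); each weight's score is |dR/dw * w|.

def synflow_scores(W1, W2):
    A1, A2 = np.abs(W1), np.abs(W2)
    ones_in = np.ones(W1.shape[1])
    ones_out = np.ones(W2.shape[0])
    # Closed-form gradients of R w.r.t. |W1| and |W2|
    g1 = np.outer(A2.T @ ones_out, ones_in)       # dR/d|W1|
    g2 = np.outer(ones_out, A1 @ ones_in)         # dR/d|W2|
    return A1 * g1, A2 * g2

def iterative_prune(W1, W2, keep_frac, n_rounds=10):
    # Exponential schedule: shrink the kept fraction gradually and rescore
    # after each round, which is what lets SynFlow avoid layer collapse.
    m1, m2 = np.ones_like(W1), np.ones_like(W2)
    for k in range(1, n_rounds + 1):
        frac = keep_frac ** (k / n_rounds)
        s1, s2 = synflow_scores(W1 * m1, W2 * m2)  # pruned weights score 0
        scores = np.concatenate([s1.ravel(), s2.ravel()])
        kth = np.quantile(scores, 1 - frac)
        m1, m2 = (s1 > kth).astype(float), (s2 > kth).astype(float)
    return m1, m2
```

Because the per-layer score totals are conserved (both sum to R in the linear case), every layer keeps some high-scoring weights even at extreme compression, which is the conservation argument the lecture walks through.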
Syllabus
- Intro & Overview
- Pruning Neural Networks
- Lottery Ticket Hypothesis
- Paper Story Overview
- Layer Collapse
- Synaptic Saliency Conservation
- Connecting Layer Collapse & Saliency Conservation
- Iterative Pruning Avoids Layer Collapse
- The SynFlow Algorithm
- Experiments
- Conclusion & Comments
Taught by
Yannic Kilcher
Related Courses
Natural Language Processing (Columbia University via Coursera)
Intro to Algorithms (Udacity)
Conception et mise en œuvre d'algorithmes [Algorithm Design and Implementation] (École Polytechnique via Coursera)
Paradigms of Computer Programming (Université catholique de Louvain via edX)
Data Structures and Algorithm Design Part I (Tsinghua University via edX)