NFNets - High-Performance Large-Scale Image Recognition Without Normalization
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a comprehensive video analysis of the DeepMind research paper "High-Performance Large-Scale Image Recognition Without Normalization". Delve into the innovative approach of Normalizer-Free Networks (NFNets), which achieve state-of-the-art classification accuracy on ImageNet without using batch normalization. Learn about the advantages and disadvantages of BatchNorm, and discover how adaptive gradient clipping (AGC) and architectural improvements enable NFNets to outperform traditional models. Gain insights into the benefits of this technique, including faster training, improved accuracy, and stronger transfer learning performance. Follow along as the video breaks down the paper's key contributions, compares NFNets to EfficientNet, and discusses the implications for future deep learning research.
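The adaptive gradient clipping mentioned above can be illustrated with a minimal sketch. The idea, as described in the paper, is to rescale a gradient whenever its norm grows too large relative to the norm of the corresponding parameter. The function name and the NumPy-based setup below are illustrative assumptions, not the authors' actual implementation (which operates unit-wise inside a full training loop):

```python
import numpy as np

def adaptive_grad_clip(param, grad, clip=0.01, eps=1e-3):
    """Sketch of adaptive gradient clipping (AGC).

    Rescales rows of `grad` whose norm exceeds `clip` times the norm of
    the matching row of `param`. `clip` and `eps` defaults are illustrative.
    """
    # Unit-wise norms: one norm per output unit (row).
    p_norm = np.maximum(np.linalg.norm(param, axis=-1, keepdims=True), eps)
    g_norm = np.linalg.norm(grad, axis=-1, keepdims=True)
    # Gradients are allowed up to clip * ||param||; larger ones are scaled down.
    max_norm = clip * p_norm
    scale = np.where(g_norm > max_norm, max_norm / np.maximum(g_norm, 1e-6), 1.0)
    return grad * scale
```

Unlike plain gradient-norm clipping, the threshold here adapts to each parameter's own scale, which is what lets NFNets train stably at large batch sizes without BatchNorm.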
Syllabus
- Intro & Overview
- What's the problem with BatchNorm?
- Paper contribution Overview
- Beneficial properties of BatchNorm
- Previous work: NF-ResNets
- Adaptive Gradient Clipping
- AGC and large batch size
- AGC induces implicit dependence between training samples
- Are BatchNorm's problems solved?
- Network architecture improvements
- Comparison to EfficientNet
- Conclusion & Comments
Taught by
Yannic Kilcher
Related Courses
Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems of Data Analysis)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX