NFNets - High-Performance Large-Scale Image Recognition Without Normalization
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a comprehensive video analysis of the DeepMind research paper "High-Performance Large-Scale Image Recognition Without Normalization". Delve into Normalizer-Free Networks (NFNets), which achieve state-of-the-art classification accuracy on ImageNet without using batch normalization. Learn about the benefits and drawbacks of BatchNorm, and discover how adaptive gradient clipping (AGC) and architectural improvements let NFNets match and surpass batch-normalized models such as EfficientNet. Gain insights into the benefits of this approach, including faster training, improved accuracy, and stronger transfer learning performance. Follow along as the video breaks down the paper's key contributions, compares NFNets to EfficientNet, and discusses the implications for future deep learning research.
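Since adaptive gradient clipping is central to the video, the following is a minimal PyTorch-style sketch of the unit-wise AGC rule described in the paper, included here only for orientation while watching. The function names, the clipping threshold of 0.01, and the epsilon value are illustrative assumptions, not the authors' reference implementation.

```python
import torch

def unitwise_norm(x: torch.Tensor) -> torch.Tensor:
    # Frobenius norm per output unit: full norm for biases/gains,
    # per-row for linear weights, per-filter for conv kernels.
    if x.ndim <= 1:
        return x.norm()
    return x.flatten(1).norm(dim=1).reshape([x.shape[0]] + [1] * (x.ndim - 1))

def adaptive_grad_clip_(params, clip=0.01, eps=1e-3):
    # In-place AGC: rescale each unit's gradient so that
    # ||g|| / max(||w||, eps) never exceeds `clip`.
    for p in params:
        if p.grad is None:
            continue
        w_norm = unitwise_norm(p.detach()).clamp(min=eps)
        g_norm = unitwise_norm(p.grad.detach())
        max_norm = w_norm * clip
        # Shrink only the units whose gradient-to-weight ratio exceeds the threshold.
        scale = (max_norm / g_norm.clamp(min=1e-6)).clamp(max=1.0)
        p.grad.mul_(scale)
```

A typical use would be to call adaptive_grad_clip_(model.parameters()) after loss.backward() and before optimizer.step(); the paper notes that the final linear layer is better left unclipped.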
Syllabus
- Intro & Overview
- What's the problem with BatchNorm?
- Paper Contribution Overview
- Beneficial properties of BatchNorm
- Previous work: NF-ResNets
- Adaptive Gradient Clipping
- AGC and large batch size
- AGC induces implicit dependence between training samples
- Are BatchNorm's problems solved?
- Network architecture improvements
- Comparison to EfficientNet
- Conclusion & Comments
Taught by
Yannic Kilcher