Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a comprehensive video analysis of the groundbreaking paper "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift." Delve into the concept of internal covariate shift and its impact on deep neural network training. Learn how batch normalization addresses this issue by normalizing layer inputs over each mini-batch, allowing for higher learning rates and less careful parameter initialization. Discover how this technique also acts as a regularizer, in some cases eliminating the need for dropout. Examine the results achieved when applying batch normalization to a state-of-the-art image classification model, which reached the same accuracy with 14 times fewer training steps and beat the original model by a significant margin. Gain insights into the paper's methodology, implementation details, and its impact on the field of deep learning.
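To make the normalization step concrete, here is a minimal NumPy sketch of the batch-norm forward pass the paper proposes: compute the per-feature mean and variance over the mini-batch, normalize, then scale and shift with the learned parameters gamma and beta. The function name and the epsilon value are illustrative assumptions, not details taken from the video.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch x of shape (N, D) per feature,
    then scale and shift with learned parameters gamma and beta."""
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta            # learned scale and shift

# Toy usage: a batch of 4 examples with 3 features
x = np.random.randn(4, 3)
y = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
```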
Syllabus
Introduction
What is Batch Normalization
Training
Back Propagation
Results
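The syllabus covers back propagation through the batch normalization layer; as a rough companion to the forward-pass sketch above, the following shows the standard closed-form gradients with respect to the input and the learned scale and shift parameters. The names and the recomputation of the batch statistics here are illustrative assumptions, not the video's notation.

```python
import numpy as np

def batch_norm_backward(dout, x, gamma, eps=1e-5):
    """Gradients of the batch-norm forward pass for a mini-batch
    x of shape (N, D), given upstream gradient dout of the same shape."""
    mu = x.mean(axis=0)
    std = np.sqrt(x.var(axis=0) + eps)
    x_hat = (x - mu) / std                 # recomputed normalized activations

    dgamma = (dout * x_hat).sum(axis=0)    # gradient of the learned scale
    dbeta = dout.sum(axis=0)               # gradient of the learned shift

    # Chain rule through the normalization, mean, and variance in closed form
    dx_hat = dout * gamma
    dx = (dx_hat - dx_hat.mean(axis=0)
          - x_hat * (dx_hat * x_hat).mean(axis=0)) / std
    return dx, dgamma, dbeta
```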
Taught by
Yannic Kilcher
Related Courses
Data Analysis and Visualization (Georgia Institute of Technology via Udacity)
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization (DeepLearning.AI via Coursera)
機器學習基石下 (Machine Learning Foundations)---Algorithmic Foundations (National Taiwan University via Coursera)
Data Science: Machine Learning (Harvard University via edX)
Art and Science of Machine Learning auf Deutsch (Google Cloud via Coursera)