Understanding Deep Learning Requires Rethinking Generalization

Offered By: University of Central Florida via YouTube

Tags

Deep Learning Courses, Data Augmentation Courses, Implicit Regularization Courses

Course Description

Overview

Explore the intricacies of deep learning and challenge conventional wisdom on generalization in this 40-minute lecture from the University of Central Florida. Delve into topics such as the Universal Approximation Theorem, L2 Regularization, Dropout, and Data Augmentation. Examine randomization tests and their results, leading to thought-provoking conclusions and implications. Investigate explicit and implicit regularization techniques, finite-sample expressivity of neural networks, and draw comparisons to linear models. Conclude by analyzing the role of Stochastic Gradient Descent (SGD) in deep learning, ultimately reshaping your understanding of generalization in neural networks.
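The paper's central randomization test, training on labels that are pure noise, can be illustrated in miniature: any sufficiently overparameterized model can reach zero training error on random labels. This connects to the lecture's "Appeal to Linear Models" section, since even a minimum-norm linear fit with more features than samples interpolates noise exactly. The sketch below is illustrative only and not from the lecture; the dimensions and seed are arbitrary assumptions.

```python
import numpy as np

# Randomization test in miniature: with more parameters than samples,
# even a linear model fits completely random labels exactly.
rng = np.random.default_rng(0)

n, p = 50, 200                        # 50 samples, 200 features: overparameterized
X = rng.standard_normal((n, p))       # random inputs
y = rng.choice([-1.0, 1.0], size=n)   # random labels: no signal at all

# Minimum-norm interpolating solution: w = X^T (X X^T)^{-1} y
w = X.T @ np.linalg.solve(X @ X.T, y)

train_error = np.max(np.abs(X @ w - y))
print(f"max training residual: {train_error:.2e}")  # effectively zero
```

Since the model memorizes noise yet (on real data) such models still generalize, capacity-based explanations of generalization come under pressure, which is the lecture's starting point.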

Syllabus

Intro
Presentation Outline
Universal Approximation Theorem
L2 Regularization - "Weight Decay"
Dropout
Data Augmentation
Randomization Tests
Results of Randomization Tests
Conclusions & Implications
Explicit Regularization Tests
Implicit Regularization Findings
Finite-Sample Expressivity of Neural Networks
Appeal to Linear Models
Investigating SGD
Final Conclusions
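One syllabus item above, L2 regularization as "weight decay", can be verified numerically: a gradient step on a loss plus an added penalty (λ/2)‖w‖² is algebraically identical to first shrinking the weights by a factor (1 − ηλ) and then taking the plain gradient step. A minimal sketch with hypothetical values (the gradient and hyperparameters are made up for illustration):

```python
import numpy as np

# Check that an explicit L2 penalty equals "weight decay" for plain SGD.
# Regularized loss: L(w) + (lam/2)*||w||^2, so its gradient is grad_L + lam*w.
lr, lam = 0.1, 0.01                   # learning rate, L2 coefficient (assumed values)
w = np.array([1.0, -2.0, 3.0])        # current weights (illustrative)
grad_L = np.array([0.5, 0.5, -1.0])   # gradient of the unregularized loss (illustrative)

# (1) explicit L2 penalty folded into the gradient
w_l2 = w - lr * (grad_L + lam * w)

# (2) equivalent "weight decay" form: shrink weights, then plain step
w_decay = (1 - lr * lam) * w - lr * grad_L

print(np.allclose(w_l2, w_decay))  # True
```

Note this equivalence holds for vanilla SGD but not for adaptive optimizers, which is one reason the lecture treats explicit regularizers separately from the implicit regularization of SGD itself.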


Taught by

UCF CRCV

Related Courses

Convolutional Neural Networks in TensorFlow
DeepLearning.AI via Coursera
Emotion AI: Facial Key-points Detection
Coursera Project Network via Coursera
Transfer Learning for Food Classification
Coursera Project Network via Coursera
Facial Expression Classification Using Residual Neural Nets
Coursera Project Network via Coursera
Apply Generative Adversarial Networks (GANs)
DeepLearning.AI via Coursera