Regularization of Big Neural Networks
Offered By: University of Central Florida via YouTube
Course Description
Overview
This lecture examines regularization techniques for large neural networks, including DropOut, DropConnect, and stochastic pooling, and then turns to deconvolutional networks for learning image features, with experiments on MNIST, Street View House Numbers, and Caltech 101.
Syllabus
Intro
Big Neural Nets
Big Models Over-Fitting
Training with DropOut
DropOut/Connect Intuition
Theoretical Analysis of DropConnect
MNIST Results
Varying Size of Network
Varying Fraction Dropped
Comparison of Convergence Rates
Limitations of DropOut/Connect
Stochastic Pooling
Methods for Test Time
Varying Size of Training Set
Convergence / Over-Fitting
Street View House Numbers
Deconvolutional Networks
Recap: Sparse Coding (Patch-based)
Reversible Max Pooling
Single Layer Cost Function
Single Layer Inference
Effect of Sparsity
Effect of Pooling Variables
Talk Overview
Stacking the Layers
Two Layer Example
Link to Parts and Structure Models
Caltech 101 Experiments
Layer 2 Filters
Classification Results: Caltech 101
Deconvolutional + Convolutional
Summary
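The chapter list above only names the techniques, but the core masking and pooling ideas are compact enough to sketch. Below is a minimal, illustrative NumPy sketch (not code from the talk) of DropOut, DropConnect, and stochastic pooling; the layer shapes, the keep probability p, and the 2x2 pooling region are assumptions made for this example.

```python
# Minimal sketches of three ideas named in the syllabus. Shapes, the keep
# probability p, and the pooling region size are illustrative assumptions,
# not values from the talk.
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                              # assumed probability of keeping a unit/weight

x = rng.standard_normal(8)           # input activations to a fully connected layer
W = rng.standard_normal((4, 8))      # layer weights
b = np.zeros(4)

# DropOut: randomly zero whole output units during training.
unit_mask = rng.random(4) < p
h_dropout = unit_mask * (W @ x + b)

# DropConnect: randomly zero individual weights (connections) instead,
# a more general per-connection mask than DropOut's per-unit one.
weight_mask = rng.random(W.shape) < p
h_dropconnect = (weight_mask * W) @ x + b

# Stochastic pooling: within each pooling region, sample one activation
# with probability proportional to its magnitude, instead of taking the max.
def stochastic_pool(region, rng):
    a = np.maximum(region.ravel(), 0.0)   # assume non-negative (post-ReLU) activations
    if a.sum() == 0.0:
        return 0.0
    return rng.choice(a, p=a / a.sum())

region = np.abs(rng.standard_normal((2, 2)))   # a single assumed 2x2 pooling region
pooled = stochastic_pool(region, rng)
```

At test time these stochastic schemes are typically replaced by deterministic approximations, for example rescaling activations by p rather than sampling masks; the "Methods for Test Time" chapter in the syllabus addresses this question.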
Taught by
UCF CRCV (Center for Research in Computer Vision)
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
Machine Learning Techniques (機器學習技法) - National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning - University of Washington via Coursera
Applied Problems in Data Analysis (Прикладные задачи анализа данных) - Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning - Microsoft via edX