Regularization of Big Neural Networks
Offered By: University of Central Florida via YouTube
Course Description
Overview
A UCF CRCV lecture on regularization techniques for large neural networks, covering DropOut, DropConnect, and stochastic pooling, followed by a second part on deconvolutional networks for unsupervised feature learning, with experiments on MNIST, Street View House Numbers, and Caltech 101.
Syllabus
Intro
Big Neural Nets
Big Models Over-Fitting
Training with DropOut
DropOut/Connect Intuition
Theoretical Analysis of DropConnect
MNIST Results
Varying Size of Network
Varying Fraction Dropped
Comparison of Convergence Rates
Limitations of DropOut/Connect
Stochastic Pooling
Methods for Test Time
Varying Size of Training Set
Convergence / Over-Fitting
Street View House Numbers
Deconvolutional Networks
Recap: Sparse Coding (Patch-based)
Reversible Max Pooling
Single Layer Cost Function
Single Layer Inference
Effect of Sparsity
Effect of Pooling Variables
Talk Overview
Stacking the Layers
Two Layer Example
Link to Parts and Structure Models
Caltech 101 Experiments
Layer 2 Filters
Classification Results: Caltech 101
Deconvolutional + Convolutional
Summary
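The DropOut/DropConnect portion of the syllabus contrasts two closely related regularizers: DropOut zeroes random activations during training, while DropConnect zeroes random weights. The NumPy sketch below illustrates only that distinction; the layer sizes, keep probability, and inverted rescaling are illustrative assumptions, not code from the talk.

import numpy as np

rng = np.random.default_rng(0)

def dropout_layer(x, W, p=0.5):
    # DropOut: zero each activation independently with probability p.
    a = x @ W
    mask = rng.random(a.shape) >= p       # keep with probability 1 - p
    return a * mask / (1.0 - p)           # rescale so the expectation is unchanged

def dropconnect_layer(x, W, p=0.5):
    # DropConnect: zero each weight independently with probability p.
    mask = rng.random(W.shape) >= p
    return x @ (W * mask) / (1.0 - p)

x = rng.standard_normal((4, 8))           # toy batch: 4 examples, 8 features
W = rng.standard_normal((8, 16))          # toy weight matrix
print(dropout_layer(x, W).shape)          # (4, 16)
print(dropconnect_layer(x, W).shape)      # (4, 16)

The "Methods for Test Time" slide concerns how to average over these random masks at inference; the usual approximations are to run with the full weights and no mask for DropOut (the rescaling above already matches expectations) and a Gaussian moment-matching approximation for DropConnect.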
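The "Stochastic Pooling" topic replaces deterministic max or average pooling with sampling: within each pooling region, one activation is selected with probability proportional to its value. A minimal sketch, assuming non-negative (e.g. rectified) activations and non-overlapping 2x2 regions:

import numpy as np

rng = np.random.default_rng(0)

def stochastic_pool(a, size=2):
    # Sample each pooled output from its region, with probability
    # proportional to the (non-negative) activation values.
    h, w = a.shape
    out = np.empty((h // size, w // size))
    for i in range(0, h, size):
        for j in range(0, w, size):
            region = a[i:i + size, j:j + size].ravel()
            s = region.sum()
            if s == 0.0:
                out[i // size, j // size] = 0.0   # all-zero region
            else:
                out[i // size, j // size] = rng.choice(region, p=region / s)
    return out

a = np.maximum(rng.standard_normal((4, 4)), 0.0)  # toy rectified feature map
print(stochastic_pool(a))

At test time the sampling is typically replaced by the probability-weighted average of each region, which approximates averaging over many stochastic passes.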
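"Reversible Max Pooling" addresses the fact that ordinary max pooling discards where each maximum came from, which would make the deconvolutional reconstruction pass ill-posed; recording "switch" locations makes the operation invertible on the pooled values. A sketch of such a pool/unpool pair for a single feature map (the function names and 2x2 regions are my assumptions):

import numpy as np

def max_pool_with_switches(a, size=2):
    # Max pooling that records the argmax ("switch") of each region.
    h, w = a.shape
    out = np.empty((h // size, w // size))
    switches = np.empty((h // size, w // size), dtype=int)
    for i in range(0, h, size):
        for j in range(0, w, size):
            region = a[i:i + size, j:j + size]
            k = int(region.argmax())              # flattened index within region
            out[i // size, j // size] = region.ravel()[k]
            switches[i // size, j // size] = k
    return out, switches

def unpool(out, switches, size=2):
    # Invert the pooling: place each max back at its recorded
    # position, zeros elsewhere.
    h, w = out.shape
    a = np.zeros((h * size, w * size))
    for i in range(h):
        for j in range(w):
            k = switches[i, j]
            a[i * size + k // size, j * size + k % size] = out[i, j]
    return a

a = np.arange(16.0).reshape(4, 4)
pooled, sw = max_pool_with_switches(a)
print(unpool(pooled, sw))   # maxima restored in place, zeros elsewhere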
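The "Single Layer Cost Function" and "Single Layer Inference" slides belong to the deconvolutional-network half of the talk, where each layer performs convolutional sparse coding: feature maps z_k are inferred by minimizing a reconstruction-plus-sparsity objective. As a hedged reconstruction from the related Deconvolutional Networks work (Zeiler et al.), the single-layer cost for an image y with filters f_k has roughly the form

C_1(y) = \frac{\lambda}{2} \Big\| y - \sum_{k=1}^{K} z_k \oplus f_k \Big\|_2^2 + \sum_{k=1}^{K} \| z_k \|_1

where \oplus denotes 2-D convolution and the \ell_1 term produces the sparsity whose influence the "Effect of Sparsity" slide examines; in the stacked model the reconstruction additionally passes through the pooling switches, which is what the "Effect of Pooling Variables" slide explores.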
Taught by
UCF CRCV
Related Courses
Practical Machine Learning - Johns Hopkins University via Coursera
Practical Deep Learning For Coders - fast.ai via Independent
Machine Learning Foundations, Part 2 (機器學習基石下): Algorithmic Foundations - National Taiwan University via Coursera
Data Analytics Foundations for Accountancy II - University of Illinois at Urbana-Champaign via Coursera
Train a Linear Predictive Model (Entraînez un modèle prédictif linéaire) - CentraleSupélec via OpenClassrooms