Implicit Regularization I
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the concept of implicit regularization in deep learning through this comprehensive lecture from the Deep Learning Boot Camp. Delve into topics such as boosting, complexity control, optimization landscapes, and biases in matrix completion. Understand the goal of learning through practical examples and gain insights into stochastic optimization techniques. Examine the intricacies of gradient descent and stochastic gradient descent, and their roles in implicit regularization. Learn from Nati Srebro of the Toyota Technological Institute at Chicago as he provides an in-depth analysis of this crucial aspect of machine learning.
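As a rough illustration of the phenomenon the lecture covers (not material from the talk itself), the sketch below shows implicit regularization in its simplest setting: on an underdetermined least-squares problem, plain gradient descent initialized at zero converges to the minimum Euclidean-norm interpolating solution even though no explicit regularizer is used. The step size and iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 100                           # fewer samples than parameters
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Gradient descent on f(w) = 0.5 * ||A w - b||^2, initialized at zero.
w = np.zeros(d)
lr = 1.0 / np.linalg.norm(A, 2) ** 2     # step size below 1/L for stability
for _ in range(50_000):
    w -= lr * A.T @ (A @ w - b)

# Minimum-norm interpolating solution via the pseudoinverse.
w_min_norm = np.linalg.pinv(A) @ b

print("training residual:", np.linalg.norm(A @ w - b))                   # ~ 0
print("distance to min-norm solution:", np.linalg.norm(w - w_min_norm))  # ~ 0
```

The bias arises because the iterates stay in the row space of the data matrix, so among all interpolating solutions the one reached is the closest to the initialization.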
Syllabus
Introduction
Boosting
Complexity Control
Optimization Landscape
Biases
Matrix Completion
Gradient Descent
Outline
Goal of Learning
Example
Stochastic Optimization
Recap
Stochastic Gradient Descent
Taught by
Simons Institute
Related Courses
Analyzing Optimization and Generalization in Deep Learning via Dynamics of Gradient Descent
Simons Institute via YouTube
Can Non-Convex Optimization Be Robust?
Simons Institute via YouTube
Finding Low-Rank Matrices - From Matrix Completion to Recent Trends
Simons Institute via YouTube
Power of Active Sampling for Unsupervised Learning
Simons Institute via YouTube
Is Optimization the Right Language to Understand Deep Learning? - Sanjeev Arora
Institute for Advanced Study via YouTube