SGD in the Large - Average-Case Analysis, Asymptotics, and Stepsize Criticality

Offered By: Fields Institute via YouTube

Tags

Machine Learning, Asymptotics, Logistic Regression, Optimization Problems, Stochastic Gradient Descent

Course Description

Overview

Explore the intricacies of Stochastic Gradient Descent (SGD) in a comprehensive lecture delivered by Courtney Paquette from McGill University at the Machine Learning Advances and Applications Seminar. Delve into average-case analysis, asymptotics, and stepsize criticality as key tools for understanding SGD's behavior. Examine optimization problems, average-case complexity, and the roles of randomness and the data distribution in SGD. Investigate the SGD convergence theorem and how its guarantees compare with worst-case analysis. Uncover the nuances of stepsize criticality and its impact on average-case complexity. Learn about stochastic momentum, the stochastic heavy ball method, and the significance of momentum parameters. Discover dimension-dependent momentum and its application to logistic regression. Gain valuable insights into the average-case analysis of SGD and its relevance to machine learning applications.
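
To make the lecture's central objects concrete, below is a minimal, self-contained Python sketch (not taken from the talk) of single-sample SGD and the stochastic heavy ball update on a random least-squares problem. The problem sizes, stepsizes, and momentum value are illustrative assumptions, chosen only to exhibit the convergence/divergence transition that "stepsize criticality" refers to.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 50                        # samples, problem dimension
A = rng.standard_normal((n, d)) / np.sqrt(d)
x_star = rng.standard_normal(d)
b = A @ x_star                        # noiseless least-squares targets

def run_sgd(stepsize, momentum=0.0, iters=3000):
    """Single-sample SGD on f(x) = (1/2n) * ||Ax - b||^2.

    With momentum > 0 this becomes the stochastic heavy ball update
        x_{k+1} = x_k - stepsize * grad_i(x_k) + momentum * (x_k - x_{k-1}).
    """
    x = np.zeros(d)
    x_prev = x.copy()
    loss = 0.5 * np.mean((A @ x - b) ** 2)
    for _ in range(iters):
        i = rng.integers(n)
        grad = (A[i] @ x - b[i]) * A[i]            # gradient of one squared residual
        x, x_prev = x - stepsize * grad + momentum * (x - x_prev), x
        loss = 0.5 * np.mean((A @ x - b) ** 2)
        if not np.isfinite(loss) or loss > 1e12:   # past the critical stepsize
            return float("inf")
    return loss

# Stepsize criticality: small stepsizes converge, large ones blow up;
# the lecture characterizes this threshold in the average case as
# n, d -> infinity with the ratio d/n held fixed.
for eta in (0.1, 0.5, 2.5):
    print(f"stepsize {eta}: final loss {run_sgd(eta):.3e}")

# Stochastic heavy ball with an illustrative momentum parameter.
print(f"heavy ball (eta=0.5, beta=0.3): final loss {run_sgd(0.5, momentum=0.3):.3e}")
```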

Syllabus

Introduction
Optimization Problems
Average-Case Complexity
Randomness
Distribution
SGD Example
SGD Worst Case
SGD Convergence Theorem
Stepsize Criticality
Average-Case Complexity
Stochastic Momentum
Stochastic Heavy Ball
Momentum Parameters
Dimension-Dependent Momentum
Thank You
Logistic Regression
Average-Case Analysis


Taught by

Fields Institute

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent