YoVDO

SGD in the Large - Average-Case Analysis, Asymptotics, and Stepsize Criticality

Offered By: Fields Institute via YouTube

Tags

Machine Learning Courses, Asymptotics Courses, Logistic Regression Courses, Optimization Problems Courses, Stochastic Gradient Descent Courses

Course Description

Overview

Explore the intricacies of Stochastic Gradient Descent (SGD) in a comprehensive lecture delivered by Courtney Paquette from McGill University at the Machine Learning Advances and Applications Seminar. Delve into average-case analysis, asymptotics, and stepsize criticality as key components of SGD. Examine optimization problems, average-case complexity, and the role of randomness and distribution in SGD. Investigate the SGD congruence theorem and its implications for worst-case scenarios. Uncover the nuances of stepsize criticality and its impact on average-case complexity. Learn about stochastic momentum, the stochastic heavy ball method, and the significance of momentum parameters. Discover dimension-dependent momentum and its applications in logistic regression. Gain valuable insights into the average-case analysis of SGD and its relevance in machine learning applications.
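The lecture contrasts plain SGD with the stochastic heavy ball (momentum) method. As a rough illustration of the latter, here is a minimal sketch, not taken from the lecture, of stochastic heavy ball applied to a least-squares problem; the stepsize and momentum values are illustrative choices, not the critical values analyzed in the talk.

```python
import numpy as np

# Hypothetical setup: least-squares objective f(x) = (1/2n) * ||A x - b||^2
# with a consistent linear system, so the solution is recoverable exactly.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def stochastic_heavy_ball(A, b, stepsize=0.002, momentum=0.5, iters=5000):
    """Stochastic heavy ball: x_{k+1} = x_k + beta (x_k - x_{k-1}) - eta g_k,
    where g_k is the gradient on a single randomly sampled row."""
    n, d = A.shape
    x = np.zeros(d)
    x_prev = x.copy()
    for _ in range(iters):
        i = rng.integers(n)                      # sample one data point
        grad = (A[i] @ x - b[i]) * A[i]          # stochastic gradient
        x, x_prev = x + momentum * (x - x_prev) - stepsize * grad, x
    return x

x_hat = stochastic_heavy_ball(A, b)
print(np.linalg.norm(x_hat - x_true))            # small residual error
```

Setting `momentum=0` recovers plain SGD; the lecture's average-case analysis studies how the error trajectory, and the largest stable stepsize, depend on the spectrum of the data rather than on worst-case constants.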

Syllabus

Introduction
Optimization Problems
Average-Case Complexity
Randomness
Distribution
SGD Example
SGD Worst Case
SGD Congruence Theorem
Stepsize Criticality
Average-Case Complexity
Stochastic Momentum
Stochastic Heavy Ball
Momentum Parameters
Dimension-Dependent Momentum
Thank You
Logistic Regression
Average-Case Analysis


Taught by

Fields Institute

Related Courses

AI for Medical Prognosis
DeepLearning.AI via Coursera
Analysis and Interpretation of Data
Queen Mary University of London via Coursera
The Analytics Edge
Massachusetts Institute of Technology via edX
Practical Use of Data Analysis for Finance
E-Learning Development Fund via Coursera
Machine Learning (Aprendizaje de máquinas)
Universidad Nacional Autónoma de México via Coursera