Stochastic Gradient Descent and Machine Learning - Lecture 1

Offered By: International Centre for Theoretical Sciences via YouTube

Tags

Machine Learning Courses
Gradient Descent Courses
Convex Functions Courses
Newton's Method Courses
Stochastic Gradient Descent Courses

Course Description

Overview

Dive into the fundamentals of optimization and machine learning in this comprehensive lecture on Stochastic Gradient Descent. Explore five different facets of optimization, including iterative methods, gradient descent, and Newton's method. Gain insights into the cheap gradient principle, fixed points of gradient descent, and the concept of convexity. Examine various examples of convex functions and delve into important theorems and proofs. Learn about subgradients of convex functions and their applications. This in-depth session, part of the Bangalore School on Statistical Physics XIII, provides a solid foundation for understanding the core principles of optimization techniques used in machine learning algorithms.
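As a minimal sketch of the plain gradient-descent iteration the lecture builds on (the quadratic objective, step size, and function names below are illustrative choices, not taken from the lecture):

import numpy as np

def gradient_descent(grad, x0, step=0.1, n_iters=100):
    # Basic iteration: x_{k+1} = x_k - step * grad(x_k).
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step * grad(x)
    return x

# Illustrative convex objective f(x) = ||x||^2 with gradient 2x; its
# minimizer x = 0 is a fixed point of the iteration (the gradient vanishes there).
x_min = gradient_descent(lambda x: 2.0 * x, x0=[3.0, -4.0])
print(x_min)  # approximately [0. 0.]

Stochastic gradient descent, the subject of the series, replaces the exact gradient grad(x) with a cheaper unbiased estimate computed from a random subsample of the data.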

Syllabus

Stochastic Gradient Descent and Machine Learning Lecture 1
5 different facets of optimization
Optimization
1. Iterative methods
Blackbox oracles
2. Gradient descent
3. Newton's method
Cheap gradient principle
Fixed points of GD
Proposition
Proof
Convexity
Examples of convex functions
Theorem
Proof
g_x is a subgradient of a convex function f at x (see the definition sketched after this syllabus)
Example
Theorem
Claim
Wrap Up
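
For reference, the standard convex-analysis definition behind the subgradient item above (the lecture's own notation may differ): a vector g_x is a subgradient of a convex function f at the point x when the linear lower bound

\[ f(y) \;\ge\; f(x) + \langle g_x,\, y - x \rangle \qquad \text{for all } y \in \operatorname{dom} f \]

holds; when f is differentiable at x, the gradient \nabla f(x) is the unique subgradient.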


Taught by

International Centre for Theoretical Sciences

Related Courses

Calculus of One Real Variable
Indian Institute of Technology Kanpur via Swayam
Operations Research (2): Optimization Algorithms
National Taiwan University via Coursera
Dynamics of Physical Systems
NPTEL via YouTube
Learn Calculus 2 & 3 from scratch to Advanced
Udemy
Applications of Calculus
Eddie Woo via YouTube