
Large Scale Machine Learning and Convex Optimization - Lecture 1

Offered By: Hausdorff Center for Mathematics via YouTube

Tags

Convex Optimization Courses, Big Data Courses, Algorithm Design Courses, Stochastic Gradient Descent Courses

Course Description

Overview

Explore the intersection of large-scale machine learning and convex optimization in this lecture by Francis Bach. Delve into the challenges posed by large datasets with many observations and high-dimensional features, and learn why online algorithms such as stochastic gradient descent are preferable to batch algorithms for processing data at this scale. Examine the optimal convergence rates for general convex and strongly convex functions, and discover how the smoothness of loss functions can be exploited to design algorithms with improved performance. Investigate a Newton-based stochastic approximation algorithm that achieves faster convergence rates without strong convexity assumptions. Gain insight into the practical benefits of combining batch and online algorithms, including linear convergence rates on strongly convex problems at a computational cost comparable to stochastic gradient descent.
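
For intuition, here is a minimal Python sketch of the online approach described above: stochastic gradient descent with Polyak-Ruppert iterate averaging on a least-squares problem. The code is purely illustrative and is not taken from the lecture; the 1/sqrt(t) step-size schedule, the toy data, and the function name sgd_least_squares are assumptions chosen to match the general (non-strongly-convex) setting.

import numpy as np

def sgd_least_squares(X, y, n_epochs=1, step0=0.1):
    # SGD with iterate averaging (Polyak-Ruppert) for the
    # least-squares objective f(w) = (1/2) E[(x'w - y)^2].
    n, d = X.shape
    w = np.zeros(d)
    w_avg = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for i in np.random.permutation(n):
            t += 1
            # Unbiased stochastic gradient from a single observation.
            grad = (X[i] @ w - y[i]) * X[i]
            # Decaying step size ~ 1/sqrt(t), the classical schedule for
            # general convex problems without strong convexity.
            w -= step0 / np.sqrt(t) * grad
            # Running average of the iterates; averaging is what makes
            # the noisy stochastic iteration statistically well-behaved.
            w_avg += (w - w_avg) / t
    return w_avg

# Toy usage: recover a planted weight vector from noisy observations.
rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 20))
w_true = rng.standard_normal(20)
y = X @ w_true + 0.1 * rng.standard_normal(10_000)
print(np.linalg.norm(sgd_least_squares(X, y) - w_true))

Each observation is touched once per pass, so the cost per iteration is O(d) regardless of the number of observations, which is the efficiency argument for online methods over batch gradient descent on large datasets.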

Syllabus

Francis Bach: Large Scale Machine Learning and Convex Optimization (Lecture 1)


Taught by

Hausdorff Center for Mathematics

Related Courses

Building Classification Models with scikit-learn
Pluralsight
Practical Deep Learning for Coders - Full Course
freeCodeCamp
Neural Networks Made Easy
Udemy
Intro to Deep Learning
Kaggle
Stochastic Gradient Descent
Great Learning via YouTube