Private Stochastic Optimization with Large Worst-Case Lipschitz Parameter

Offered By: USC Probability and Statistics Seminar via YouTube

Tags

Stochastic Optimization, Machine Learning, Gradient Descent, Convex Optimization, Differential Privacy

Course Description

Overview

Explore differentially private stochastic optimization in a 58-minute lecture from the USC Probability and Statistics Seminar. Delve into the challenges posed by loss functions whose worst-case Lipschitz parameter is extremely large due to outliers. Discover near-optimal excess risk bounds that overcome the limitations of uniform Lipschitz assumptions by scaling instead with a bound on the k-th moment of the stochastic gradients. Examine asymptotically optimal results for convex and strongly convex losses, as well as novel approaches for non-convex proximal-PL (Polyak-Łojasiewicz) functions. Learn about accelerated algorithms for smooth losses that achieve tight excess risk in practical parameter regimes. Gain insights into addressing heavy-tailed data and outliers in private optimization, with applications to real-world machine learning problems.
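
For orientation, the standard way such heavy-tailed or outlier-prone gradients are handled in private optimization is per-sample gradient clipping followed by calibrated Gaussian noise, so that the sensitivity depends on the clip threshold rather than the worst-case Lipschitz constant. The sketch below illustrates that general idea only; the model, data, clip threshold, and noise calibration are illustrative assumptions and not the specific algorithms analyzed in the lecture.

import numpy as np

rng = np.random.default_rng(0)

def clipped_noisy_sgd(X, y, epochs=20, lr=0.1, clip=1.0, noise_mult=1.0):
    """DP-SGD-style loop for linear regression with per-sample clipping (schematic)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Per-sample gradients of the squared loss; these can be heavy-tailed
        # when the data contains outliers.
        grads = (X @ w - y)[:, None] * X          # shape (n, d)
        # Clip each sample's gradient to L2 norm <= clip, bounding sensitivity
        # by clip instead of the worst-case Lipschitz parameter.
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
        # Average the clipped gradients and add Gaussian noise; the noise scale
        # here is schematic rather than a tuned privacy calibration.
        noisy_grad = grads.mean(axis=0) + rng.normal(0, noise_mult * clip / n, size=d)
        w -= lr * noisy_grad
    return w

# Toy data with heavy-tailed noise (Student-t with 2 degrees of freedom).
X = rng.normal(size=(500, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.standard_t(df=2, size=500)
print(clipped_noisy_sgd(X, y))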

Syllabus

Andrew Lowy: Private Stochastic Optimization with Large Worst-Case Lipschitz Parameter... (USC)


Taught by

USC Probability and Statistics Seminar

Related Courses

Statistical Machine Learning
Carnegie Mellon University via Independent
Secure and Private AI
Facebook via Udacity
Data Privacy and Anonymization in R
DataCamp
Build and operate machine learning solutions with Azure Machine Learning
Microsoft via Microsoft Learn
Data Privacy and Anonymization in Python
DataCamp