Federated Learning with Formal User-Level Differential Privacy Guarantees
Offered By: TheIACR via YouTube
Course Description
Overview
Explore federated learning with formal user-level differential privacy guarantees in this 59-minute invited talk from PPML 2022. Delve into topics such as (non-)convex learning, cross-device federated learning, differentially private stochastic gradient descent (DP-SGD), and DP-Federated Averaging (DP-FedAvg). Examine challenges in amplification by sampling, noise accumulation in prefix sums, and tree aggregation. Investigate DP-Follow-the-regularized-leader (DP-FTRL) and its online learning properties. Analyze privacy-utility trade-offs using Stack Overflow as an example, and discover a production model trained with formal differential privacy. Gain insights into the matrix factorization view of prefix sum estimation and of DP prefix sums. Conclude with future directions and acknowledgements in this comprehensive exploration of privacy-preserving machine learning techniques.
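The DP-SGD idea mentioned above (clip each per-example gradient, then add calibrated Gaussian noise to the aggregate) can be sketched in a few lines. This is a minimal NumPy illustration, not the talk's implementation: the squared loss, the function name `dp_sgd_step`, and the parameter values are all illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD step (illustrative sketch): clip each per-example
    gradient to L2 norm <= clip_norm, sum, add Gaussian noise scaled
    to the clipping bound, and average."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(y)
    # Per-example gradients for the squared loss 0.5 * (x.w - y)^2
    residuals = X @ w - y
    grads = residuals[:, None] * X                      # shape (n, d)
    # Clip each example's gradient so its L2 norm is at most clip_norm;
    # this bounds each user's contribution (sensitivity = clip_norm).
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    # Add Gaussian noise calibrated to the clipping bound, then average.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=w.shape)
    noisy_grad = (grads.sum(axis=0) + noise) / n
    return w - lr * noisy_grad
```

In DP-FedAvg the same recipe is applied at the user level: each client's whole model update, rather than a single example's gradient, is clipped before the noisy aggregate is formed.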
Syllabus
Intro
(Non)-convex learning
Differential Privacy
Cross-device federated learning
Differentially private stochastic gradient descent (DP-SGD)
DP-SGD: Key insights
DP-Federated Averaging (DP-FedAvg)
Challenges for Amplification by Sampling in FL
Deconstructing the SGD model update
Noise Accumulation in Prefix Sums
Towards Tree Aggregation
Interlude: Follow-the-regularized-leader (FTRL)
DP-Follow-the-regularized-leader (DP-FTRL)
DP-FTRL: Online learning properties
Privacy-Utility Trade-offs for Stack Overflow
Production model with formal DP
Matrix factorization view of prefix sum estimation
Matrix factorization view of DP prefix sum
Future directions
Acknowledgements
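The syllabus's matrix factorization view of prefix sum estimation can be sketched as follows: write the prefix sums as S = A g for the lower-triangular all-ones matrix A, factor A = B C, privatize C g with Gaussian noise, and decode with B. The sketch below uses the trivial factorization B = A, C = I to show the mechanism's shape; tree aggregation corresponds to a different, cleverer choice of B and C with better noise behavior. Function and variable names here are illustrative assumptions, not the talk's notation.

```python
import numpy as np

def dp_prefix_sums(g, B, C, noise_std, rng=None):
    """Estimate prefix sums S = A @ g, where A = B @ C:
    add Gaussian noise to C @ g, then decode with B."""
    rng = np.random.default_rng(0) if rng is None else rng
    noisy = C @ g + rng.normal(0.0, noise_std, size=C.shape[0])
    return B @ noisy

n = 4
A = np.tril(np.ones((n, n)))      # prefix-sum matrix: S_t = g_1 + ... + g_t
# Trivial factorization A = A @ I; tree aggregation corresponds to a
# different factorization of A that accumulates less noise per query.
B, C = A, np.eye(n)
g = np.array([1.0, 2.0, 3.0, 4.0])
est = dp_prefix_sums(g, B, C, noise_std=0.0)   # noiseless check: exact sums
```

With `noise_std=0.0` the estimate recovers the exact prefix sums (1, 3, 6, 10); with positive noise, the error profile of the estimates depends on the chosen factorization, which is the trade-off the matrix factorization view makes explicit.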
Taught by
TheIACR
Related Courses
Building Classification Models with scikit-learn (Pluralsight)
Practical Deep Learning for Coders - Full Course (freeCodeCamp)
Neural Networks Made Easy (Udemy)
Intro to Deep Learning (Kaggle)
Stochastic Gradient Descent (Great Learning via YouTube)