A Fast Algorithm for Adaptive Private Mean Estimation in Machine Learning
Offered By: Google TechTalks via YouTube
Course Description
Overview
Explore a Google TechTalk on a fast algorithm for adaptive private mean estimation, presented by John Duchi as part of the Privacy ML series. Delve into the design of an (ε,δ)-differentially private algorithm for estimating the mean of a d-variate distribution with unknown covariance Σ. Learn how this algorithm achieves optimal convergence rates with respect to the induced Mahalanobis norm, runs in Õ(nd²) time, and has near-linear sample complexity for sub-Gaussian distributions. Discover its ability to handle degenerate or low-rank Σ and to adaptively extend beyond sub-Gaussianity. Understand the significance of this work in overcoming the limitations of previous approaches, which required either exponential computation time or sample complexity that scales superlinearly with the dimension. Examine topics such as the Laplace mechanism, truncation, CoinPress, the Propose-Test-Release framework, and stable mean estimation. Access the related paper on arXiv for further insights into this approach to private mean estimation.
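As background for the Laplace mechanism and truncation topics covered in the talk, the snippet below is a minimal sketch of the classical one-dimensional baseline: clip each sample to a bounded interval and add Laplace noise calibrated to the resulting sensitivity. It illustrates only the textbook Laplace mechanism, not the adaptive estimator from the talk; the function name private_mean_1d, the interval bounds, and the example data are illustrative assumptions.

```python
import numpy as np

def private_mean_1d(x, epsilon, lower, upper, rng=None):
    """Estimate the mean of x with epsilon-differential privacy.

    Each sample is truncated (clipped) to [lower, upper], so changing one
    record moves the sum by at most (upper - lower). The empirical mean
    therefore has sensitivity (upper - lower) / n, and adding Laplace noise
    with scale sensitivity / epsilon yields an epsilon-DP estimate.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    clipped = np.clip(x, lower, upper)
    sensitivity = (upper - lower) / n
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

# Example usage: 1,000 samples from N(2, 1), truncated to [-10, 10].
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=1000)
print(private_mean_1d(data, epsilon=1.0, lower=-10.0, upper=10.0, rng=rng))
```

Note that the truncation interval must be chosen without looking at the private data; choosing it adaptively is one of the difficulties the talk's approach addresses.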
Syllabus
Introduction
The Problem
Laplace Mechanism
One-dimensional Mean
Truncation
Naive Case
Adaptive Mean Estimation
CoinPress
Propose-Test-Release Framework
Two-Phase Approach
Sample Covariance
Stable Mean Estimation
Building Blocks
Taught by
Google TechTalks
Related Courses
Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent