Fine-Grained Poisoning Attack to Local Differential Privacy Protocols for Mean and Variance Estimation

Offered By: USENIX via YouTube

Tags

USENIX Security Courses
Data Analysis Courses
Cybersecurity Courses
Algorithm Design Courses
Local Differential Privacy Courses

Course Description

Overview

Explore a cutting-edge research presentation on data poisoning attacks against local differential privacy (LDP) protocols. Delve into the proposed fine-grained attack, which manipulates mean and variance estimates in LDP systems. Learn about the novel output poisoning attack (OPA), which injects fake data into the output domain of local LDP instances. Examine the security-privacy consistency observed in LDP and gain insights into the evolving threat landscape of data poisoning attacks. Compare the effectiveness of OPA against baseline attacks on real-world datasets. Discover a new defense method for recovering result accuracy from polluted data collections, and understand the implications for secure LDP design. This 20-minute conference talk from USENIX Security '23 offers valuable knowledge for researchers and practitioners in privacy-preserving data analysis and cybersecurity.
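To make the output-poisoning idea concrete, here is a minimal Python sketch of how fake reports injected directly into the output domain can skew an LDP mean estimate. This is not the paper's OPA: it assumes Duchi et al.'s one-bit mechanism for mean estimation over [-1, 1], and the fake-user count and epsilon are illustrative choices.

```python
import numpy as np

def duchi_perturb(x, eps, rng):
    # Duchi et al.'s one-bit LDP mechanism for mean estimation over [-1, 1]:
    # each user reports +C or -C, and the average of all reports is an
    # unbiased estimate of the true mean.
    C = (np.exp(eps) + 1) / (np.exp(eps) - 1)
    p = 0.5 + x * (np.exp(eps) - 1) / (2 * (np.exp(eps) + 1))
    return C if rng.random() < p else -C

rng = np.random.default_rng(0)
eps = 1.0                                      # illustrative privacy budget
genuine = rng.uniform(-1, 1, 10_000)           # genuine data, true mean ~ 0
reports = [duchi_perturb(x, eps, rng) for x in genuine]

# Output poisoning: fake users bypass the local perturbation entirely and
# submit the extreme value +C directly in the *output* domain, which is
# more effective than poisoning inputs that still pass through the mechanism.
C = (np.exp(eps) + 1) / (np.exp(eps) - 1)
n_fake = 500                                   # 5% fake users (assumption)
poisoned = reports + [C] * n_fake

print("clean estimate   :", np.mean(reports))
print("poisoned estimate:", np.mean(poisoned))
```

Running this shows the poisoned estimate pulled noticeably above the clean one; the paper's fine-grained attack goes further, crafting the fake outputs so mean and variance are steered to attacker-chosen targets simultaneously.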

Syllabus

USENIX Security '23 - Fine-grained Poisoning Attack to Local Differential Privacy Protocols for Mean and Variance Estimation


Taught by

USENIX

Related Courses

CALM - Consistent Adaptive Local Marginal for Marginal Release under Local Differential Privacy
Association for Computing Machinery (ACM) via YouTube
Heavy Hitter Estimation over Set-Valued Data with Local Differential Privacy
Association for Computing Machinery (ACM) via YouTube
Jelani Nelson - Forty Years of Frequent Items
International Mathematical Union via YouTube
Utility-Optimized Local Differential Privacy Mechanisms for Distribution Estimation
USENIX via YouTube
Manipulation Attacks in Local Differential Privacy
IEEE via YouTube