
Fine-Grained Poisoning Attack to Local Differential Privacy Protocols for Mean and Variance Estimation

Offered By: USENIX via YouTube

Tags

USENIX Security Courses, Data Analysis Courses, Cybersecurity Courses, Algorithm Design Courses, Local Differential Privacy Courses

Course Description

Overview

Explore a cutting-edge research presentation on data poisoning attacks against local differential privacy (LDP) protocols. Delve into the proposed fine-grained attack that manipulates mean and variance estimations in LDP systems. Learn about the novel output poisoning attack (OPA) technique, which injects fake data into the output domain of local LDP instances. Examine the security-privacy consistency observed in LDP and gain insights into the evolving threat landscape of data poisoning attacks. Compare the effectiveness of OPA against baseline attacks using real-world datasets. Discover a new defense method for recovering result accuracy from polluted data collections and understand implications for secure LDP design. This 20-minute conference talk from USENIX Security '23 offers valuable knowledge for researchers and practitioners in the fields of privacy-preserving data analysis and cybersecurity.
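To make the core idea concrete, the sketch below illustrates output poisoning on a generic Laplace-based LDP mean estimator: controlled clients bypass the local randomizer and submit crafted values directly in the output domain. The mechanism, value range, and attack parameters here are illustrative assumptions for intuition only, not the specific protocols or the OPA construction analyzed in the paper.

```python
import numpy as np

# Minimal sketch (not the paper's exact protocol): Laplace-based LDP mean
# estimation over values in [-1, 1], plus a generic output-poisoning attacker
# that injects reports directly into the output domain.

rng = np.random.default_rng(0)

def ldp_report(x, eps):
    """Genuine user: perturb x in [-1, 1] with Laplace noise (sensitivity 2)."""
    return x + rng.laplace(scale=2.0 / eps)

def estimate_mean(reports):
    """Aggregator: the mean estimate is simply the average of all reports."""
    return float(np.mean(reports))

eps = 1.0
genuine = rng.uniform(-1, 1, size=10_000)           # true data, mean near 0
honest_reports = [ldp_report(x, eps) for x in genuine]

# Output poisoning: m fake users skip the randomizer and submit extreme but
# plausible-looking values chosen to drag the estimated mean upward.
m = 500
fake_reports = [1.0 + 2.0 / eps] * m                # hypothetical attack values

print("clean estimate   :", estimate_mean(honest_reports))
print("poisoned estimate:", estimate_mean(honest_reports + fake_reports))
```

Because legitimate Laplace-perturbed reports can fall well outside [-1, 1], the aggregator cannot trivially reject the fake reports, which is what makes output-domain poisoning harder to detect than simply lying about one's input.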

Syllabus

USENIX Security '23 - Fine-grained Poisoning Attack to Local Differential Privacy Protocols for Mean and Variance Estimation


Taught by

USENIX

Related Courses

Natural Language Processing
Columbia University via Coursera
Intro to Algorithms
Udacity
Conception et mise en œuvre d'algorithmes.
École Polytechnique via Coursera
Paradigms of Computer Programming
Université catholique de Louvain via edX
Data Structures and Algorithm Design Part I | 数据结构与算法设计(上)
Tsinghua University via edX