
Unveiling Hidden Backdoors in Manifold Distribution Gaps

Offered By: BIMSA via YouTube

Tags

Machine Learning Security Courses
Data Security Courses
Classification Models Courses
Deep Neural Networks Courses
Adversarial Machine Learning Courses

Course Description

Overview

Explore the critical security concern of backdoor attacks on deep neural networks in this 55-minute conference talk from ICBS2024. Delve into the innovative approach of separating classification models into manifold embedding and classifier components. Discover how mode mixture features within manifold distribution gaps can be exploited as backdoors to extend decision boundaries. Learn about a universal backdoor attack framework applicable across various data modalities, offering high explainability and stealthiness. Examine the effectiveness of this method on high-dimensional natural datasets and gain insights into the potential vulnerabilities of classification models.
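To make the described decomposition concrete, below is a minimal PyTorch sketch of a classifier split into a manifold-embedding encoder and a classifier head, with a "mode mixture" latent point chosen in the gap between two class modes. The class names, the mean-based mode estimate, and the poisoning loss are illustrative assumptions for orientation only, not the speaker's actual framework.

```python
# Sketch: split a classifier into a manifold embedding (Encoder) and a
# classifier head (Head), then form a mode-mixture latent point in the
# distribution gap between two class modes. Illustrative assumptions only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps inputs onto a low-dimensional latent manifold."""
    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Head(nn.Module):
    """Classifies latent codes; its decision boundary lives in latent space."""
    def __init__(self, latent_dim=16, num_classes=10):
        super().__init__()
        self.fc = nn.Linear(latent_dim, num_classes)

    def forward(self, z):
        return self.fc(z)

encoder, head = Encoder(), Head()

def class_mode(latents):
    # Mean latent code of a class: a crude stand-in for a distribution mode.
    return latents.mean(dim=0)

# Latent codes of two benign classes (placeholder random inputs).
z_class_a = encoder(torch.randn(64, 784))
z_class_b = encoder(torch.randn(64, 784))

# A mode-mixture feature: an interpolation lying in the gap between the two
# modes, where few natural samples fall.
mixture_latent = 0.5 * class_mode(z_class_a) + 0.5 * class_mode(z_class_b)

# Poisoning idea (sketch): associate inputs that embed near mixture_latent
# with an attacker-chosen label, extending the head's decision boundary
# across the gap without disturbing benign regions.
target_label = torch.tensor([3])
logits = head(mixture_latent.unsqueeze(0))
loss = nn.CrossEntropyLoss()(logits, target_label)
```

Because the trigger lives in a region the natural data distribution barely covers, such a backdoor would be expected to leave clean accuracy largely intact, which is the stealthiness property the talk highlights.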

Syllabus

Min Zhang: Unveiling Hidden Backdoors in Manifold Distribution Gaps #ICBS2024


Taught by

BIMSA

Related Courses

Sequences, Time Series and Prediction
DeepLearning.AI via Coursera
A Beginners Guide to Data Science
Udemy
Artificial Neural Networks(ANN) Made Easy
Udemy
Makine Mühendisleri için Derin Öğrenme (Deep Learning for Mechanical Engineers)
Udemy
Customer Analytics in Python
Udemy