YoVDO

Attacking Optical Flow

Offered By: Andreas Geiger via YouTube

Tags

Adversarial Attacks, Image Classification, Optical Flow, Deep Neural Networks, Semantic Segmentation

Course Description

Overview

Explore a keynote presentation on attacking optical flow in deep neural networks for automated driving safety. Delve into the vulnerability of state-of-the-art optical flow estimation techniques to adversarial attacks, particularly focusing on patch attacks that can significantly compromise performance. Examine the differences in susceptibility between encoder-decoder and spatial pyramid network architectures. Learn about various types of attacks, including white-box, black-box, and real-world scenarios, as well as their implications for self-driving technology. Gain insights into the importance of robust artificial intelligence in safety-critical applications and the challenges of situational driving and data aggregation.
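One evaluation covered in the talk is the "Zero-Flow Test": feeding a network two identical frames, for which the ground-truth optical flow is zero everywhere, so any non-zero prediction induced by an adversarial patch is purely an attack artifact. A minimal sketch of that idea, assuming a hypothetical `estimate_flow` callable (the `dummy_flow` stand-in below is illustrative, not a model from the talk):

```python
import numpy as np

def zero_flow_test(estimate_flow, frame, patch=None, patch_xy=(0, 0)):
    """Zero-flow sanity check: two identical frames should yield ~zero flow.

    estimate_flow: hypothetical callable (frame1, frame2) -> flow of shape (H, W, 2).
    If a patch is given, it is pasted at patch_xy into BOTH frames, so any
    non-zero flow it induces is an artifact of the patch, not of motion.
    Returns the average endpoint error against the zero-flow ground truth.
    """
    f1, f2 = frame.copy(), frame.copy()
    if patch is not None:
        y, x = patch_xy
        h, w = patch.shape[:2]
        f1[y:y + h, x:x + w] = patch
        f2[y:y + h, x:x + w] = patch
    flow = estimate_flow(f1, f2)
    # Endpoint error per pixel is the L2 norm of the flow vector; ground truth is zero.
    return np.linalg.norm(flow, axis=-1).mean()

# Illustrative stand-in estimator: always predicts zero flow, so it passes trivially.
def dummy_flow(f1, f2):
    return np.zeros(f1.shape[:2] + (2,))

frame = np.random.rand(64, 64, 3)
patch = np.random.rand(16, 16, 3)
print(zero_flow_test(dummy_flow, frame))              # 0.0 for the zero-flow stand-in
print(zero_flow_test(dummy_flow, frame, patch, (8, 8)))
```

A real evaluation would replace `dummy_flow` with a trained estimator (e.g. an encoder-decoder or spatial pyramid network, as contrasted in the talk) and compare the error with and without the patch.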

Syllabus

Intro
Collaborators
Self-Driving must be Robust
Situational Driving
Data Aggregation
Adversarial Attacks on Image Classification
Adversarial Attacks on Semantic Segmentation
Physical Adversarial Attacks
Robust Adversarial Attacks
Adversarial Patch Attacks
Low-Level Perception
Motion Estimation
Variational Optical Flow
Encoder-Decoder Networks
Spatial Pyramid Networks
Motivation
Attacking Optical Flow
White Box Attacks
Black-Box Attacks
Real-World Attack
Zero-Flow Test
Summary


Taught by

Andreas Geiger
