Attacking Optical Flow
Offered By: Andreas Geiger via YouTube
Course Description
Overview
Explore a keynote presentation on adversarial attacks against deep optical flow networks and their consequences for automated driving safety. Delve into the vulnerability of state-of-the-art optical flow estimation to adversarial attacks, in particular patch attacks, in which a small crafted pattern placed in the scene can severely degrade the predicted flow. Examine why encoder-decoder and spatial pyramid network architectures differ in their susceptibility to such attacks. Learn about white-box, black-box, and real-world attack settings and their implications for self-driving technology. Gain insights into the importance of robust artificial intelligence in safety-critical applications and into the challenges of situational driving and data aggregation.
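To make the patch-attack idea concrete, here is a minimal PyTorch sketch. Everything in it is illustrative, not the talk's actual setup: TinyFlowNet is a stand-in for a real flow network such as FlowNetC, the patch is pasted at a fixed top-left location rather than optimized over positions, rotations, and scales, and the loss simply pushes the predicted flow away from the zero it would otherwise see.

```python
import torch
import torch.nn as nn

# Stand-in for a real optical flow network; purely illustrative.
class TinyFlowNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(6, 2, kernel_size=3, padding=1)

    def forward(self, img1, img2):
        # Concatenate the frame pair and predict a (B, 2, H, W) flow field.
        return self.conv(torch.cat([img1, img2], dim=1))

def attack_step(flow_net, img1, img2, patch, lr=0.01):
    """One signed-gradient ascent step on a small adversarial patch
    pasted (for simplicity) into the top-left corner of both frames."""
    ph, pw = patch.shape[-2:]
    adv1, adv2 = img1.clone(), img2.clone()
    adv1[:, :, :ph, :pw] = patch
    adv2[:, :, :ph, :pw] = patch
    flow = flow_net(adv1, adv2)
    # Illustrative objective: maximize flow distortion away from zero.
    loss = flow.abs().mean()
    loss.backward()
    with torch.no_grad():
        patch += lr * patch.grad.sign()  # ascend on the distortion objective
        patch.clamp_(0, 1)               # keep the patch a printable image
        patch.grad.zero_()
    return loss.item()

net = TinyFlowNet()
for p in net.parameters():
    p.requires_grad_(False)  # white-box attack: model is fixed, only the patch is optimized

img1 = torch.rand(1, 3, 64, 64)
img2 = torch.rand(1, 3, 64, 64)
patch = torch.rand(3, 16, 16, requires_grad=True)
for _ in range(50):
    attack_step(net, img1, img2, patch)
```

A striking point of the talk is that a patch covering only a small fraction of the image can corrupt the flow estimate well beyond its own footprint, with encoder-decoder networks affected more strongly than spatial pyramid ones.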
Syllabus
Intro
Collaborators
Self-Driving must be Robust
Situational Driving
Data Aggregation
Adversarial Attacks on Image Classification
Adversarial Attacks on Semantic Segmentation
Physical Adversarial Attacks
Robust Adversarial Attacks
Adversarial Patch Attacks
Low-Level Perception
Motion Estimation
Variational Optical Flow
Encoder-Decoder Networks
Spatial Pyramid Networks
Motivation
Attacking Optical Flow
White Box Attacks
Black-Box Attacks
Real-World Attack
Zero-Flow Test (see the sketch after this syllabus)
Summary
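The "Zero-Flow Test" item can be read as a sanity check: feed the network two identical frames, for which the ground-truth flow is zero everywhere, so that any non-zero prediction, with or without a patch present, is pure error. A minimal sketch under that reading, reusing net, img1, and patch from the attack example above (assumptions for illustration, not the talk's code):

```python
import torch

def zero_flow_test(flow_net, frame, patch=None):
    """Run the network on an identical frame pair (true flow is zero
    everywhere) and report the mean endpoint error of its prediction."""
    img = frame.clone()
    if patch is not None:
        ph, pw = patch.shape[-2:]
        img[:, :, :ph, :pw] = patch.detach()  # same fixed placement as above
    with torch.no_grad():
        flow = flow_net(img, img)
    # Endpoint error against the all-zero ground-truth flow.
    return flow.norm(dim=1).mean().item()

clean_epe = zero_flow_test(net, img1)
attacked_epe = zero_flow_test(net, img1, patch)  # much larger if the attack worked
```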
Taught by
Andreas Geiger
Related Courses
Stereo Vision, Dense Motion & Tracking (University at Buffalo via Coursera)
Computer Vision - Object Tracking with OpenCV and Python (Coursera Project Network via Coursera)
3D Reconstruction - Multiple Viewpoints (Columbia University via Coursera)
Computer Vision for Visual Effects Lectures Spring 2014 (Rensselaer Polytechnic Institute via YouTube)
Object Tracking and Motion Detection with Computer Vision (MathWorks via Coursera)