Manifold Mixup: Better Representations by Interpolating Hidden States
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore the regularization technique called Manifold Mixup in this informative video. Learn how this method addresses common issues in standard neural networks, such as irregular classification boundaries and overconfident predictions. Discover the process of interpolating hidden representations of different data points and training the network to predict equally interpolated labels. Understand how Manifold Mixup encourages neural networks to predict less confidently on interpolations of hidden representations, resulting in smoother decision boundaries at multiple levels of representation. Examine the theoretical foundations behind this technique and its practical benefits for supervised learning, robustness to single-step adversarial attacks, and test log-likelihood. Gain insights into how Manifold Mixup helps neural networks learn class representations with fewer directions of significant variance, and explore its connections to previous work on information theory and generalization.
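To make the interpolation idea concrete, here is a minimal sketch of the core training step in PyTorch. It assumes a hypothetical model split into two stages (names like TwoStageNet and manifold_mixup_step are illustrative, not from the video or the paper), and it mixes at the single hidden layer; the original method samples an eligible layer at random each step.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import Beta

class TwoStageNet(nn.Module):
    """Hypothetical model split into stages so hidden states can be mixed mid-forward."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.stage2 = nn.Linear(hidden, n_classes)

    def forward(self, x, lam=None, perm=None):
        h = self.stage1(x)
        if lam is not None:
            # Interpolate hidden representations of different data points in the batch.
            h = lam * h + (1 - lam) * h[perm]
        return self.stage2(h)

def manifold_mixup_step(model, x, y, alpha=2.0):
    """One training step: mix hidden states and train on the equally interpolated labels."""
    lam = Beta(alpha, alpha).sample().item()   # mixing coefficient
    perm = torch.randperm(x.size(0))           # pairing of examples within the batch
    logits = model(x, lam=lam, perm=perm)
    # Interpolated-label objective, written as a convex combination of the two losses.
    loss = lam * F.cross_entropy(logits, y) + (1 - lam) * F.cross_entropy(logits, y[perm])
    return loss
```

In practice this step replaces the usual forward/loss computation in the training loop; setting lam to None recovers standard training on unmixed hidden states.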
Syllabus
Manifold Mixup: Better Representations by Interpolating Hidden States
Taught by
Yannic Kilcher
Related Courses
Machine Learning - University of Washington via Coursera
Machine Learning - Stanford University via Coursera
Machine Learning - Georgia Institute of Technology via Udacity
Statistical Learning with R - Stanford University via edX
Machine Learning 1—Supervised Learning - Brown University via Udacity