Manifold Mixup: Better Representations by Interpolating Hidden States

Offered By: Yannic Kilcher via YouTube

Tags

Neural Networks, Supervised Learning, Information Theory, Adversarial Attacks

Course Description

Overview

Explore the regularization technique called Manifold Mixup in this video. Learn how the method addresses common failure modes of standard neural networks, such as non-smooth decision boundaries and overconfident predictions. Discover how it interpolates the hidden representations of pairs of data points and trains the network to predict the equally interpolated labels. Because the network is encouraged to be less confident on interpolations of hidden representations, it learns smoother decision boundaries at multiple levels of representation. Examine the theoretical foundations behind the technique and its practical benefits: improved supervised learning, robustness to single-step adversarial attacks, and better test log-likelihood. Gain insight into how Manifold Mixup leads networks to learn class representations with fewer directions of variance, and explore its connections to previous work on information theory and generalization.
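To make the core idea concrete, here is a minimal PyTorch-style sketch of one Manifold Mixup training step: a random eligible layer is chosen (layer 0 corresponds to ordinary input mixup), hidden states are interpolated with a shuffled copy of the batch using a Beta-distributed coefficient, and the loss targets the equally interpolated labels. The model architecture and the names MixupMLP and manifold_mixup_loss are illustrative assumptions, not the paper's reference implementation.

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixupMLP(nn.Module):
    """Small MLP whose forward pass can interpolate hidden states at a chosen layer."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU()),
            nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU()),
        ])
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x, mix_layer=None, lam=1.0, perm=None):
        h = x
        for k, layer in enumerate(self.layers):
            if mix_layer == k:
                # Interpolate this batch's representations with a shuffled copy:
                # h <- lam * h_i + (1 - lam) * h_j
                h = lam * h + (1.0 - lam) * h[perm]
            h = layer(h)
        return self.head(h)

def manifold_mixup_loss(model, x, y, alpha=2.0):
    """Mix hidden representations at a randomly chosen layer (layer 0 = input
    mixup) and train against the equally interpolated labels."""
    lam = float(np.random.beta(alpha, alpha))
    perm = torch.randperm(x.size(0))
    k = int(np.random.randint(len(model.layers)))  # pick an eligible layer
    logits = model(x, mix_layer=k, lam=lam, perm=perm)
    # Cross-entropy is linear in the target, so this equals cross-entropy
    # against the interpolated one-hot labels lam*y_i + (1-lam)*y_j.
    return lam * F.cross_entropy(logits, y) + (1.0 - lam) * F.cross_entropy(logits, y[perm])

# Toy usage on random data.
model = MixupMLP()
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
loss = manifold_mixup_loss(model, x, y)
loss.backward()

Interpolating at a random layer each step, rather than only at the input as in standard mixup, is what flattens class representations and smooths boundaries at multiple levels of the network.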

Syllabus

Manifold Mixup: Better Representations by Interpolating Hidden States


Taught by

Yannic Kilcher

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX