Manifold Mixup: Better Representations by Interpolating Hidden States

Offered By: Yannic Kilcher via YouTube

Tags

Neural Networks Courses
Supervised Learning Courses
Information Theory Courses
Adversarial Attacks Courses

Course Description

Overview

Explore the regularization technique Manifold Mixup in this informative video. Learn how the method addresses common issues in standard neural networks, such as irregular classification boundaries and overconfident predictions. Discover how it interpolates the hidden representations of different data points and trains the network to predict equally interpolated labels. Understand how Manifold Mixup encourages neural networks to predict less confidently on interpolations of hidden representations, resulting in smoother decision boundaries at multiple levels of representation. Examine the theoretical foundations behind the technique and its practical benefits: improved supervised learning, robustness to single-step adversarial attacks, and better test log-likelihood. Gain insights into how Manifold Mixup helps neural networks learn class representations with fewer directions of variance, and explore its connections to previous work on information theory and generalization.
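The core interpolation step described above can be sketched in a few lines. The code below is a minimal NumPy illustration (not the authors' implementation): it mixes two hidden representations with a coefficient drawn from a Beta distribution, as in the mixup family of methods, and mixes the one-hot labels with the same coefficient. The function name `manifold_mixup` and the choice of `alpha` are illustrative assumptions.

```python
import numpy as np

def manifold_mixup(h_a, h_b, y_a, y_b, alpha=2.0, rng=None):
    """Interpolate two hidden representations and their one-hot labels.

    The mixing coefficient lam is drawn from Beta(alpha, alpha); the same
    lam is applied to the hidden states and to the labels, so the network
    is trained to predict the interpolated label on the interpolated
    hidden state.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    h_mix = lam * h_a + (1.0 - lam) * h_b  # interpolated hidden state
    y_mix = lam * y_a + (1.0 - lam) * y_b  # equally interpolated label
    return h_mix, y_mix, lam

# Example: two 4-dimensional hidden states from different classes
h_a, h_b = np.ones(4), np.zeros(4)
y_a, y_b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
h_mix, y_mix, lam = manifold_mixup(h_a, h_b, y_a, y_b, rng=np.random.default_rng(0))
```

In the full method, the interpolation is applied at a randomly chosen hidden layer of the network during training, not just at the input; penalizing confident predictions on these mixed points is what flattens the class representations and smooths the decision boundaries.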

Syllabus

Manifold Mixup: Better Representations by Interpolating Hidden States


Taught by

Yannic Kilcher

Related Courses

Information Theory
The Chinese University of Hong Kong via Coursera
Fundamentals of Electrical Engineering
Rice University via Coursera
Computational Neuroscience
University of Washington via Coursera
Introduction to Complexity
Santa Fe Institute via Complexity Explorer
Tutorials for Complex Systems
Santa Fe Institute via Complexity Explorer