The Information Bottleneck Theory of Deep Neural Networks

Offered By: Simons Institute via YouTube

Tags

Deep Neural Networks Courses
Information Theory Courses
Factorization Courses
Confidence Courses
Stochastic Gradient Descent Courses
Statistical Learning Theory Courses

Course Description

Overview

Explore the Information Bottleneck Theory of Deep Neural Networks in this lecture by Naftali Tishby of the Hebrew University of Jerusalem. Delve into statistical learning theory, neural network applications, and information theory. Examine concepts such as soft partitioning, the information plane, and stochastic gradient descent. Analyze per-layer averages, classical theory, dimensionality, confidence, factorization, cardinality, and the ultimate bound. Gain insights into targeted discovery in brain data and deepen your understanding of deep neural networks through this presentation from the Simons Institute.
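The information plane mentioned above plots each hidden layer T at the coordinates (I(X;T), I(T;Y)), the mutual information the layer retains about the input X and about the label Y. As a minimal sketch of the idea (not the estimator used in the lecture; the toy data, binning choices, and function names here are illustrative assumptions), one can estimate these two quantities from samples with a simple histogram plug-in estimator:

```python
import numpy as np

def mutual_information(a, b, bins=10):
    """Estimate I(A;B) in bits from paired samples via histogram binning.

    This is a crude plug-in estimator for illustration only; serious
    information-plane analyses use far more careful estimation.
    """
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p_ab = joint / joint.sum()               # empirical joint p(a, b)
    p_a = p_ab.sum(axis=1, keepdims=True)    # marginal p(a)
    p_b = p_ab.sum(axis=0, keepdims=True)    # marginal p(b)
    mask = p_ab > 0                          # avoid log(0) terms
    return float((p_ab[mask] * np.log2(p_ab[mask] / (p_a @ p_b)[mask])).sum())

# Toy setup: a "hidden representation" T that is a noisy copy of input X,
# and a binary label Y derived from X. The pair (I(X;T), I(T;Y)) is one
# point on the information plane.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
t = x + 0.5 * rng.normal(size=5000)   # layer activation: X plus noise
y = (x > 0).astype(float)             # label: sign of X

print(mutual_information(x, t))  # I(X;T): what the layer keeps about the input
print(mutual_information(t, y))  # I(T;Y): what the layer keeps about the label
```

Because Y depends on T only through X (the Markov chain Y–X–T), the data-processing inequality bounds I(T;Y) by I(X;T), which the estimates above reflect.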

Syllabus

Intro
Statistical Learning Theory
Neural Network Applications
Information Theory
Soft Partitioning
Information Plane
Stochastic Gradient Descent
Average Per Layer
Classical Theory
Dimensionality
Confidence
Factorization
Cardinality
The Ultimate Bound


Taught by

Simons Institute

Related Courses

Statistical Machine Learning
Eberhard Karls University of Tübingen via YouTube
Interpolation and Learning With Scale Dependent Kernels
MIT CBMM via YouTube
Statistical Learning Theory and Applications - Class 16
MIT CBMM via YouTube
Statistical Learning Theory and Applications - Class 6
MIT CBMM via YouTube
Statistical Learning Theory and Applications - Class 7
MIT CBMM via YouTube