The Information Bottleneck Theory of Deep Neural Networks

Offered By: Simons Institute via YouTube

Tags

Deep Neural Networks, Information Theory, Factorization, Confidence, Stochastic Gradient Descent, Statistical Learning Theory

Course Description

Overview

Explore the Information Bottleneck Theory of Deep Neural Networks in this lecture by Naftali Tishby of the Hebrew University of Jerusalem. Delve into statistical learning theory, neural network applications, and information theory. Examine concepts such as soft partitioning, the information plane, and stochastic gradient descent. Analyze the per-layer averages, classical theory, dimensionality, confidence, factorization, cardinality, and the ultimate bound. Gain insights into targeted discovery in brain data and deepen your understanding of deep neural networks through this comprehensive presentation from the Simons Institute.
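The information-plane analysis discussed in the lecture tracks the mutual information between each layer's (discretized) activations and the inputs or labels as training proceeds. As a minimal sketch of the idea, here is the standard plug-in (histogram) estimator of mutual information for discrete samples; the function name `mutual_information` and the toy data are illustrative, not from the lecture:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in (empirical-frequency) estimate of I(X; Y) in bits
    for two sequences of discrete samples."""
    n = len(x)
    joint, px, py = {}, {}, {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
        px[xi] = px.get(xi, 0) + 1
        py[yi] = py.get(yi, 0) + 1
    mi = 0.0
    for (xi, yi), c in joint.items():
        pxy = c / n
        # I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) )
        mi += pxy * np.log2(pxy * n * n / (px[xi] * py[yi]))
    return mi

# Toy example: a hidden representation T that perfectly encodes a binary label Y
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])
t = y.copy()  # perfectly informative "layer"
print(mutual_information(t, y))  # → 1.0 bit
```

In information-plane plots, continuous activations are first binned into discrete values, and estimates like this one are computed per layer per epoch to trace the fitting and compression phases Tishby describes.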

Syllabus

Intro
Statistical Learning Theory
Neural Network Applications
Information Theory
Soft Partitioning
Information Plane
Stochastic Gradient Descent
Average Per Layer
Classical Theory
Dimensionality
Confidence
Factorization
Cardinality
The Ultimate Bound


Taught by

Simons Institute

Related Courses

Álgebra básica
Universidad Nacional Autónoma de México via Coursera
A Basic Course in Number Theory
Indian Institute of Technology Bombay via Swayam
Applied Quantum Computing III: Algorithm and Software
Purdue University via edX
Advanced Functions: A Complete Course on Precalculus
Udemy
Master Number Theory 2020: The Secrets Of Numbers
Udemy