The Information Bottleneck Theory of Deep Neural Networks
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the Information Bottleneck Theory of Deep Neural Networks in this lecture by Naftali Tishby of the Hebrew University of Jerusalem. Delve into statistical learning theory, neural network applications, and information theory. Examine concepts such as soft partitioning, the information plane, and stochastic gradient descent. Analyze per-layer averages, classical learning theory, dimensionality, confidence, factorization, cardinality, and the ultimate bound. Gain insights into targeted discovery in brain data and deepen your understanding of deep neural networks through this presentation from the Simons Institute.
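The information plane mentioned above plots, for each layer T of a network, its mutual information with the input X and with the label Y. A minimal sketch of the binning estimator commonly used to draw such plots is below; the toy data, function names, and bin count are illustrative assumptions, not material from the lecture.

```python
import numpy as np

def discretize(activations, n_bins=30):
    # Bin continuous layer activations into discrete states; each unique
    # binned row becomes one discrete symbol for the layer variable T.
    edges = np.linspace(activations.min(), activations.max(), n_bins + 1)
    binned = np.digitize(activations, edges[1:-1])
    _, t = np.unique(binned, axis=0, return_inverse=True)
    return t

def mutual_information(a, b):
    # I(A;B) in bits, from the empirical joint distribution of two
    # discrete integer-valued arrays.
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for ai, bi in zip(a, b):
        joint[ai, bi] += 1
    joint /= joint.sum()
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])).sum())

# Toy setup: X indexes each training sample, Y is a binary label, and
# "acts" stands in for a hidden layer whose activations correlate with Y.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
acts = y[:, None] + 0.3 * rng.standard_normal((200, 5))
x = np.arange(200)          # each input treated as a distinct symbol
t = discretize(acts)
print(mutual_information(x, t), mutual_information(t, y))  # I(X;T), I(T;Y)
```

Tracking these two quantities per layer over training epochs yields the fitting-then-compression trajectories the lecture discusses.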
Syllabus
Intro
Statistical Learning Theory
Neural Network Applications
Information Theory
Soft Partitioning
Information Plane
Stochastic Gradient Descent
Average Per Layer
Classical Theory
Dimensionality
Confidence
Factorization
Cardinality
The Ultimate Bound
Taught by
Simons Institute
Related Courses
Information Theory (The Chinese University of Hong Kong via Coursera)
Fundamentals of Electrical Engineering (Rice University via Coursera)
Computational Neuroscience (University of Washington via Coursera)
Introduction to Complexity (Santa Fe Institute via Complexity Explorer)
Tutorials for Complex Systems (Santa Fe Institute via Complexity Explorer)