The Information Knot - Tying Sensing and Action; Emergence Theory of Representation Learning
Offered By: New York University (NYU) via YouTube
Course Description
Overview
Explore the intricacies of representation learning in artificial intelligence through this 55-minute seminar by Stefano Soatto at New York University. Delve into the information knot tying sensing and action, and the emergence theory of representation learning. Examine key concepts such as representation sufficiency, the information bottleneck, mutual information, and cost functionals. Investigate the representation of past data, disentangling, the bias-variance tradeoff, and local entropy solutions. Gain insights into flat minima, limit cycles, eigenvalues, and the Fokker-Planck equation. Analyze the implications of the standard Gaussian relaxation and consider future directions in this field of study.
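For orientation, the information bottleneck mentioned above is usually formulated as a Lagrangian that trades off compressing the input against preserving task-relevant information; the exact cost functional used in the seminar may differ, so the following is only the standard textbook form:

\[
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta \, I(Z; Y)
\]

Here X is the observed data, Y is the task variable, Z is the learned representation, and the multiplier β controls how much task-relevant information I(Z; Y) is retained for a given amount of compression I(X; Z).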
Syllabus
Introduction
Background
Tenets
Overview
Representation
Sufficiency
Information Bottleneck
The Task
Mutual Information
Cost Functional
Representation of Past Data
Two Information Bottlenecks
Results
Disentangling
Bias
Kullback-Leibler Divergence
Flat minima
Bias-Variance Tradeoff
Notation
Fokker-Planck Equation
Limit Cycles
Eigenvalues
Local Entropy
Local Entropy Solution
Standard Gaussian Relaxation
Where do we take this
What does the theory not cover
Taught by
NYU Tandon School of Engineering
Tags
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Artificial Intelligence for Robotics - Stanford University via Udacity
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent