Information Theory with Kernel Methods
Offered By: Conference GSI via YouTube
Course Description
Overview
Explore the intersection of information theory and kernel methods in this 59-minute conference talk from GSI. Dive into the challenges of estimating and computing entropies of probability distributions, key computational tasks throughout data science. Examine situations where the underlying distributions are known only through the expectations of feature vectors, focusing on the case where those feature vectors are rank-one positive definite matrices. Discover how the associated covariance matrices can be combined with information divergences from quantum information theory to establish connections with classical Shannon entropies and relative entropies. Gain insights into advanced computational techniques for handling complex probability distributions in various data science applications.
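To make the covariance-operator viewpoint concrete, here is a minimal, illustrative sketch (not code from the talk): it estimates the von Neumann entropy of an empirical kernel covariance operator from samples, assuming a Gaussian (RBF) kernel with k(x, x) = 1 so that the operator has unit trace. The kernel choice, bandwidth, and function names are placeholders; the computation relies on the standard fact that the nonzero eigenvalues of the normalized kernel matrix K / n coincide with those of the empirical covariance operator in the reproducing kernel Hilbert space.

import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix; since k(x, x) = 1, the empirical
    # covariance operator built from it has unit trace, like a density operator.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_von_neumann_entropy(X, sigma=1.0):
    # The nonzero eigenvalues of K / n equal those of the empirical covariance
    # operator (1/n) * sum_i phi(x_i) (x) phi(x_i) in the RKHS, so the
    # von Neumann entropy -tr(S log S) can be computed from K alone.
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    eigvals = np.linalg.eigvalsh(K / n)
    eigvals = eigvals[eigvals > 1e-12]      # discard numerical zeros
    return float(-np.sum(eigvals * np.log(eigvals)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))           # toy two-dimensional samples
    print(kernel_von_neumann_entropy(X, sigma=1.0))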
Syllabus
INFORMATION THEORY WITH KERNEL METHODS
Taught by
Conference GSI
Related Courses
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Utilisez des modèles supervisés non linéaires (Use Non-linear Supervised Models)
CentraleSupélec via OpenClassrooms
Statistical Machine Learning
Eberhard Karls University of Tübingen via YouTube
Interplay of Linear Algebra, Machine Learning, and HPC - JuliaCon 2021 Keynote
The Julia Programming Language via YouTube
Interpolation and Learning With Scale Dependent Kernels
MITCBMM via YouTube