From Information Theory to Learning via Statistical Physics - Introduction
Offered By: International Centre for Theoretical Sciences via YouTube
Course Description
Overview
Explore the intersection of information theory, statistical physics, and machine learning in this comprehensive lecture by Florent Krzakala. Delve into topics such as classical statistics, high-dimensional statistics, signal processing, and regression, while examining their connections to statistical physics. Learn about Bayes' rule, estimators, and Fisher information, and discover how these concepts apply to real-world problems. Investigate the relationship between statistical mechanics and machine learning, and understand the role of the Bayes risk in discrete problems. Gain insights into the interdisciplinary nature of these fields and their applications in solving complex physical and biological systems.
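For orientation only (an illustrative note, not part of the original course description), the Bayes rule and Fisher information mentioned above take their standard textbook forms:

P(x \mid y) = \frac{P(y \mid x)\, P(x)}{P(y)} \quad \text{(Bayes' rule: posterior from likelihood and prior)}

I(\theta) = \mathbb{E}\!\left[ \left( \frac{\partial}{\partial \theta} \log P(y \mid \theta) \right)^{2} \right] \quad \text{(Fisher information)}

The Fisher information bounds the variance of any unbiased estimator through the Cramér-Rao inequality, which is how the "estimators" topic in the syllabus connects to it.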
Syllabus
US-India Advanced Studies Institute: Classical and Quantum Information
From information theory to learning via statistical physics: Introduction: Statistical learning, Bayes' rule, estimators, and statistical physics
Topics
Connecting physics and information theory
Example 1: "Classical statistics"
Prove
Solve the problem
Assume uniform prior
Prove
Fisher information
Example 2: High-dimensional statistics
Signal processing
Regression
Statistical physics problem
Back to the Bayesian formulation
Claim
Statistical mechanics
3. Estimators and Bayes optimality
Bayes risks
Discrete problem
Summary
Taught by
International Centre for Theoretical Sciences
Related Courses
Statistical Mechanics: Algorithms and Computations - École normale supérieure via Coursera
Physics of Materials - Indian Institute of Technology Madras via Swayam
From Atoms to Materials: Predictive Theory and Simulations - Purdue University via edX
Statistical Mechanics - Indian Institute of Technology Madras via Swayam
Thermodynamics: Classical To Statistical - Indian Institute of Technology Guwahati via Swayam