From Information Theory to Learning via Statistical Physics - Introduction
Offered By: International Centre for Theoretical Sciences via YouTube
Course Description
Overview
Explore the intersection of information theory, statistical physics, and machine learning in this comprehensive lecture by Florent Krzakala. Delve into topics such as classical statistics, high-dimensional statistics, signal processing, and regression, while examining their connections to statistical physics. Learn about Bayes' rule, estimators, and Fisher information, and discover how these concepts apply to real-world problems. Investigate the relationship between statistical mechanics and machine learning, and understand the importance of Bayes risk in discrete problems. Gain insights into the interdisciplinary nature of these fields and their applications in solving complex physical and biological systems.
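To make the overview's statistical vocabulary concrete, here is a minimal sketch (my own illustration, not code from the lecture) of the classical-statistics setting the syllabus opens with: estimating the mean of a Gaussian with known variance, where the maximum-likelihood estimator coincides with the Bayes posterior mean under a uniform (flat) prior, and where the Fisher information sets the Cramer-Rao limit on the estimator's variance.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0        # known noise standard deviation (assumed for illustration)
theta_true = 1.5   # parameter to estimate
n = 1000           # observations per experiment

# Fisher information of one Gaussian observation: I(theta) = 1 / sigma^2
fisher_info = 1.0 / sigma**2

# Repeat the experiment many times to check the Cramer-Rao bound empirically:
# the variance of the ML estimator (the sample mean) should approach
# 1 / (n * I(theta)).
estimates = [rng.normal(theta_true, sigma, n).mean() for _ in range(5000)]
empirical_var = np.var(estimates)
cramer_rao = 1.0 / (n * fisher_info)

print(empirical_var, cramer_rao)  # the two values should be close
```

The agreement between the empirical variance and `cramer_rao` is the elementary version of the efficiency statements the lecture builds on.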
Syllabus
US-India Advanced Studies Institute: Classical and Quantum Information
From Information Theory to Learning via Statistical Physics: Introduction: Statistical learning, Bayes' rule, estimators, and statistical physics
Topics
Connecting physics and information theory
Example 1: "Classical statistics"
Proof
Solve the problem
Assume uniform prior
Proof
Fisher information
Example 2: High-dimensional statistics
Signal processing
Regression
Statistical physics problem
Back to the Bayesian formulation
Claim
Statistical mechanics
3. Estimators and Bayes optimality
Bayes risk
Discrete problem
Summary
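The closing syllabus topics (Bayes risk, discrete problem) can be sketched in a few lines. This is my own toy example, not material from the lecture: a binary spin x in {-1, +1} with uniform prior, observed through Gaussian noise y = x + z, is the simplest discrete inference problem where the Bayes-optimal decision and its risk can be written down and checked against simulation.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)
sigma = 1.0          # noise level (assumed for illustration)
n = 200_000
x = rng.choice([-1.0, 1.0], size=n)       # uniform prior over {-1, +1}
y = x + rng.normal(0.0, sigma, size=n)    # noisy channel observation

# Bayes-optimal decision under 0-1 loss: take the sign of y (the MAP estimate,
# since the prior is uniform and the Gaussian likelihood is symmetric).
x_hat = np.sign(y)
empirical_risk = np.mean(x_hat != x)

# Theoretical Bayes risk: P(error) = Q(1/sigma) = 0.5 * erfc(1 / (sigma * sqrt(2)))
bayes_risk = 0.5 * erfc(1.0 / (sigma * sqrt(2.0)))

print(empirical_risk, bayes_risk)  # should agree to a couple of decimal places
```

Under squared-error loss the Bayes-optimal estimator would instead be the posterior mean, tanh(y / sigma**2), which is the kind of soft estimator that connects to the statistical-mechanics formulation in the lecture.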
Taught by
International Centre for Theoretical Sciences
Related Courses
Bioelectricity: A Quantitative Approach - Duke University via Coursera
Animal Behaviour - University of Melbourne via Coursera
Epigenetic Control of Gene Expression - University of Melbourne via Coursera
Introduction to Systems Biology - Icahn School of Medicine at Mount Sinai via Coursera
Network Analysis in Systems Biology - Icahn School of Medicine at Mount Sinai via Coursera