Discovering Symbolic Models from Deep Learning with Inductive Biases
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a detailed explanation of a research paper that combines Graph Neural Networks with symbolic regression to derive accurate symbolic equations from observational data. Delve into the problem of extracting discrete symbolic equations from neural networks, which are typically adept at predicting numerical outputs but difficult to interpret. Learn about symbolic regression, Graph Neural Networks, and their inductive biases for physics. Understand how Graph Networks compute outputs, how loss is backpropagated, and the analogies between Graph Networks and Newtonian mechanics. Discover the process of converting a Graph Network into an equation, including L1 regularization of edge messages. Examine practical examples in Newtonian dynamics and cosmology, including a novel analytic formula for predicting dark matter concentration. Gain insights into interpreting neural networks and uncovering new physical principles from learned representations.
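The core idea described above, computing pairwise edge messages with a small network and applying an L1 penalty so that only a few message components stay active, can be sketched in a few lines. This is a minimal toy illustration, not the paper's exact architecture: the graph, weight shapes, and `edge_messages` helper are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def edge_messages(x, edges, W1, W2):
    """Compute a message vector for each directed (sender, receiver) edge.

    Each message is produced by a tiny one-hidden-layer MLP applied to the
    concatenated states of the two endpoint nodes (a stand-in for the paper's
    learned edge function).
    """
    msgs = []
    for i, j in edges:
        h = np.concatenate([x[i], x[j]])   # pairwise input
        h = np.maximum(W1 @ h, 0.0)        # hidden layer (ReLU)
        msgs.append(W2 @ h)                # message vector
    return np.stack(msgs)

# Toy graph: 3 nodes with 2-D states, fully connected with directed edges.
x = rng.normal(size=(3, 2))
edges = [(i, j) for i in range(3) for j in range(3) if i != j]
W1 = rng.normal(size=(8, 4)) * 0.1         # hidden layer weights (assumed sizes)
W2 = rng.normal(size=(4, 8)) * 0.1         # message output weights

msgs = edge_messages(x, edges, W1, W2)

# L1 penalty on the edge messages, added to the prediction loss during
# training: it drives most message components toward zero, so the few
# surviving components are simple enough to fit with symbolic regression.
l1_penalty = np.abs(msgs).sum()
print(msgs.shape, l1_penalty)
```

In the paper's setting, after training with this sparsity pressure, the dominant message components are fit against candidate symbolic expressions (e.g. with a symbolic regression tool), which is how closed-form force laws can be recovered from the learned network.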
Syllabus
- Intro & Outline
- Problem Statement
- Symbolic Regression
- Graph Neural Networks
- Inductive Biases for Physics
- How Graph Networks compute outputs
- Loss Backpropagation
- Graph Network Recap
- Analogies of GN to Newtonian Mechanics
- From Graph Network to Equation
- L1 Regularization of Edge Messages
- Newtonian Dynamics Example
- Cosmology Example
- Conclusions & Appendix
Taught by
Yannic Kilcher
Related Courses
- Neural Networks for Machine Learning, University of Toronto via Coursera
- 機器學習技法 (Machine Learning Techniques), National Taiwan University via Coursera
- Machine Learning Capstone: An Intelligent Application with Deep Learning, University of Washington via Coursera
- Прикладные задачи анализа данных (Applied Data Analysis), Moscow Institute of Physics and Technology via Coursera
- Leading Ambitious Teaching and Learning, Microsoft via edX