Discovering Symbolic Models from Deep Learning with Inductive Biases

Offered By: Yannic Kilcher via YouTube

Tags

Inductive Bias, Physics, Deep Learning, Cosmology

Course Description

Overview

Explore a detailed explanation of a research paper that combines Graph Neural Networks with symbolic regression to derive accurate symbolic equations from observational data. Delve into the problem of extracting discrete symbolic equations from neural networks, which are typically adept at predicting numerical outputs but hard to interpret. Learn about symbolic regression, Graph Neural Networks, and their inductive biases for physics. Understand how Graph Networks compute outputs, how the loss is backpropagated, and the analogies between Graph Networks and Newtonian mechanics. Discover the process of converting a Graph Network into an equation, including L1 regularization of edge messages. Examine practical examples in Newtonian dynamics and cosmology, including a novel analytic formula for predicting dark matter concentration. Gain insights into interpreting neural networks and uncovering new physical principles from learned representations.
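The core trick described above, computing a message vector per edge and applying an L1 penalty so that only a few message components survive, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the real edge model is an MLP, and the graph, dimensions, and weight matrix `W` here are arbitrary assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 3 particles, fully connected directed edges (sender, receiver).
edges = [(s, r) for s in range(3) for r in range(3) if s != r]
positions = rng.normal(size=(3, 2))  # 2-D positions per particle

# Hypothetical linear edge model (the paper uses an MLP): maps the
# concatenated sender/receiver states (4 dims) to a 16-dim message.
W = rng.normal(size=(4, 16))

def edge_messages(pos):
    # One message vector per edge, stacked into (num_edges, 16).
    inputs = np.stack([np.concatenate([pos[s], pos[r]]) for s, r in edges])
    return inputs @ W

def l1_penalty(messages, lam=1e-2):
    # L1 regularization of edge messages: drives most message components
    # toward zero, so the few that remain can later be matched against a
    # compact symbolic formula (e.g. a pairwise force law).
    return lam * np.abs(messages).sum()

msgs = edge_messages(positions)
loss_reg = l1_penalty(msgs)  # added to the prediction loss during training
```

In training, this penalty is added to the prediction loss; the sparsity it induces is what makes the surviving message components amenable to symbolic regression afterwards.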

Syllabus

- Intro & Outline
- Problem Statement
- Symbolic Regression
- Graph Neural Networks
- Inductive Biases for Physics
- How Graph Networks compute outputs
- Loss Backpropagation
- Graph Network Recap
- Analogies of GN to Newtonian Mechanics
- From Graph Network to Equation
- L1 Regularization of Edge Messages
- Newtonian Dynamics Example
- Cosmology Example
- Conclusions & Appendix
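The "Symbolic Regression" step in the syllabus above can be illustrated with a brute-force toy: search a small space of candidate expressions for the one that best fits the data. This is only a stand-in for the genetic-algorithm search used in the paper; the data, basis functions, and coefficient grid below are invented for the sketch.

```python
# Toy data generated from an unknown law y = 2*x**2 (the target to recover).
xs = [0.5, 1.0, 1.5, 2.0]
ys = [2 * x**2 for x in xs]

# Candidate expression templates: coefficient * basis(x).
bases = {"x": lambda x: x, "x^2": lambda x: x**2, "1/x": lambda x: 1.0 / x}
coeffs = [0.5, 1.0, 2.0, 3.0]

def mse(f, c):
    # Mean squared error of the candidate expression c * f(x) on the data.
    return sum((c * f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Exhaustive search over (coefficient, basis) pairs for the best fit.
best = min(((c, name) for c in coeffs for name in bases),
           key=lambda cn: mse(bases[cn[1]], cn[0]))
# best -> (2.0, "x^2"), i.e. the search recovers y = 2*x**2
```

Real symbolic regression tools search a far larger expression space with evolutionary methods, but the objective is the same: the simplest formula with the lowest fitting error.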


Taught by

Yannic Kilcher

Related Courses

- FA17: Machine Learning (Georgia Institute of Technology via edX)
- Machine Learning (Georgia Institute of Technology via edX)
- Noether Networks - Meta-Learning Useful Conserved Quantities (Yannic Kilcher via YouTube)
- An Image is Worth 16x16 Words - Transformers for Image Recognition at Scale (Yannic Kilcher via YouTube)
- MIT EI Seminar - Phillip Isola - Emergent Intelligence: Getting More Out of Agents Than You Bake In (Massachusetts Institute of Technology via YouTube)