Discovering Symbolic Models from Deep Learning with Inductive Biases

Offered By: Yannic Kilcher via YouTube

Tags

Inductive Bias Courses, Physics Courses, Deep Learning Courses, Cosmology Courses

Course Description

Overview

Explore a detailed explanation of a research paper that combines Graph Neural Networks with symbolic regression to derive accurate symbolic equations from observational data. Delve into the problem of extracting discrete symbolic equations from neural networks, which are typically adept at predicting numerical outputs rather than closed-form expressions. Learn about symbolic regression, Graph Neural Networks, and their inductive biases for physics. Understand how Graph Networks compute their outputs, how the loss is backpropagated, and the analogies between Graph Networks and Newtonian mechanics. Discover the process of converting a Graph Network into an equation, including L1 regularization of the edge messages. Examine practical examples in Newtonian dynamics and cosmology, including a novel analytic formula for predicting dark matter concentration. Gain insights into interpreting neural networks and uncovering new physical principles from learned representations.
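
As an illustration of the core idea covered in the video, here is a minimal sketch, not the paper's actual code, of a graph network whose per-edge messages receive an L1 penalty so that only a few message components stay active and can later be fit symbolically. The names EdgeModel, NodeModel, and graph_net_step, the layer sizes, the message dimension, the L1 strength, and the toy data are all illustrative assumptions.

```python
# Minimal graph-network sketch with an L1 penalty on edge messages (assumed setup).
import torch
import torch.nn as nn

class EdgeModel(nn.Module):
    """Computes a message for each (sender, receiver) pair of particle states."""
    def __init__(self, node_dim=6, msg_dim=100, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * node_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, msg_dim),
        )

    def forward(self, sender, receiver):
        return self.net(torch.cat([sender, receiver], dim=-1))

class NodeModel(nn.Module):
    """Maps a node state plus its aggregated incoming message to an output (e.g. acceleration)."""
    def __init__(self, node_dim=6, msg_dim=100, out_dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(node_dim + msg_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, node, agg_msg):
        return self.net(torch.cat([node, agg_msg], dim=-1))

def graph_net_step(nodes, edge_index, edge_model, node_model):
    """One message-passing step: messages on edges, summed per receiver, then node update."""
    senders, receivers = edge_index                       # two [E] integer tensors
    msgs = edge_model(nodes[senders], nodes[receivers])   # [E, msg_dim]
    agg = torch.zeros(nodes.size(0), msgs.size(-1))
    agg = agg.index_add(0, receivers, msgs)               # sum-aggregate messages per node
    return node_model(nodes, agg), msgs

# Toy usage with random data (shapes only; no physical meaning).
nodes = torch.randn(4, 6)                      # 4 particles, 6 state features each
edge_index = torch.tensor([[0, 1, 2, 3],       # senders
                           [1, 2, 3, 0]])      # receivers
edge_model, node_model = EdgeModel(), NodeModel()
pred, msgs = graph_net_step(nodes, edge_index, edge_model, node_model)

target = torch.randn(4, 2)                     # e.g. observed accelerations
l1_strength = 1e-2                             # assumed value, not taken from the paper
loss = nn.functional.mse_loss(pred, target) + l1_strength * msgs.abs().mean()
loss.backward()                                # backpropagate through both models
```

In the analogy to Newtonian mechanics discussed in the video, the edge messages play the role of pairwise forces and the node model plays the role of the update from net force to acceleration, which is why the L1-sparsified messages become candidates for known force laws.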

Syllabus

- Intro & Outline
- Problem Statement
- Symbolic Regression
- Graph Neural Networks
- Inductive Biases for Physics
- How Graph Networks compute outputs
- Loss Backpropagation
- Graph Network Recap
- Analogies of GN to Newtonian Mechanics
- From Graph Network to Equation (see the code sketch after this syllabus)
- L1 Regularization of Edge Messages
- Newtonian Dynamics Example
- Cosmology Example
- Conclusions & Appendix
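
For the "From Graph Network to Equation" step listed above, the sketch below shows how the extracted edge messages might be handed to a symbolic regression tool. The paper's pipeline uses its own symbolic regression tooling; the open-source PySR package, its parameter values, the operator list, and the stand-in message data here are assumptions made purely for illustration.

```python
# Hedged illustration: fit an analytic expression to a (stand-in) learned message component.
import numpy as np
from pysr import PySRRegressor

# Pretend these were extracted from a trained graph network:
#   features = per-edge inputs (relative displacement dx, dy and distance r)
#   messages = the single most significant message component on each edge
rng = np.random.default_rng(0)
dx, dy = rng.normal(size=(2, 500))
r = np.sqrt(dx**2 + dy**2) + 1e-3
features = np.column_stack([dx, dy, r])
messages = dx / r**3                      # stand-in for a learned force-like message

model = PySRRegressor(
    niterations=40,                       # assumed search budget
    binary_operators=["+", "-", "*", "/"],
)
model.fit(features, messages)             # search for an analytic form of the message
print(model)                              # best-fit symbolic expressions by complexity
```

Because the L1 penalty leaves only a handful of informative message components, the regression problem is low-dimensional enough for a symbolic search to recover compact expressions such as force laws or, in the cosmology example, a formula for dark matter concentration.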


Taught by

Yannic Kilcher

Related Courses

- Networked Life (University of Pennsylvania via Coursera)
- Intro to Physics (Udacity)
- How Things Work: An Introduction to Physics (University of Virginia via Coursera)
- Solar: Solar Cells, Fuel Cells and Batteries (Stanford University via Stanford OpenEdx)
- A Look at Nuclear Science and Technology (University of Pittsburgh via Coursera)