Discovering Symbolic Models from Deep Learning with Inductive Biases
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a detailed explanation of a research paper that combines Graph Neural Networks with symbolic regression to derive accurate symbolic equations from observational data. Delve into the problem of extracting discrete symbolic equations from neural networks, which are typically adept at predicting numerical outputs rather than producing interpretable formulas. Learn about symbolic regression, Graph Neural Networks, and their inductive biases for physics. Understand how Graph Networks compute outputs, how the loss is backpropagated, and the analogies between Graph Networks and Newtonian mechanics. Discover the process of converting a Graph Network to an equation, including L1 regularization of edge messages. Examine practical examples in Newtonian dynamics and cosmology, including a novel analytic formula for predicting dark matter concentration. Gain insights into interpreting neural networks and uncovering new physical principles from learned representations.
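To make the core idea concrete, below is a minimal sketch (not from the paper or the video, and assuming PyTorch) of a message-passing graph network whose edge messages carry an L1 penalty during training. The penalty pushes most message components toward zero, so the few that survive can later be fit by symbolic regression as functions of pairwise quantities such as relative position and mass. All names here (InteractionNet, training_step, the feature layout) are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class InteractionNet(nn.Module):
    """Toy graph network: an edge MLP produces messages, a node MLP predicts accelerations."""
    def __init__(self, node_dim=6, msg_dim=16, hidden=64):
        super().__init__()
        # Edge model: concatenated (sender, receiver) features -> message vector.
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, msg_dim),
        )
        # Node model: node features plus summed incoming messages -> predicted 2D acceleration.
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + msg_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x, edge_index):
        # x: [num_nodes, node_dim]; edge_index: [2, num_edges], rows are (sender, receiver).
        senders, receivers = edge_index
        messages = self.edge_mlp(torch.cat([x[senders], x[receivers]], dim=-1))
        # Sum the messages arriving at each receiver node.
        agg = torch.zeros(x.shape[0], messages.shape[-1], device=x.device)
        agg.index_add_(0, receivers, messages)
        return self.node_mlp(torch.cat([x, agg], dim=-1)), messages


def training_step(model, optimizer, x, edge_index, target_acc, l1_weight=1e-2):
    """One optimization step: prediction loss plus an L1 penalty on the edge messages.

    The L1 term encourages sparse, low-dimensional messages, which is what makes the
    learned interaction easy to match against a closed-form force law afterwards.
    """
    optimizer.zero_grad()
    pred_acc, messages = model(x, edge_index)
    loss = nn.functional.mse_loss(pred_acc, target_acc) + l1_weight * messages.abs().mean()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In the workflow the video describes, the recorded message components over many particle pairs are then handed to a symbolic regression tool, which searches over algebraic expressions of the pairwise features to recover formulas such as Newtonian force laws or the reported dark matter concentration relation.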
Syllabus
- Intro & Outline
- Problem Statement
- Symbolic Regression
- Graph Neural Networks
- Inductive Biases for Physics
- How Graph Networks compute outputs
- Loss Backpropagation
- Graph Network Recap
- Analogies of GN to Newtonian Mechanics
- From Graph Network to Equation
- L1 Regularization of Edge Messages
- Newtonian Dynamics Example
- Cosmology Example
- Conclusions & Appendix
Taught by
Yannic Kilcher
Related Courses
- Arab-Muslim Philosophy: Generating new meaning of the world (E-Learning Development Fund via Coursera)
- Astronomy: Exploring Time and Space (University of Arizona via Coursera)
- Astrophysics: From the Stars to the Edge of the Universe (St. Petersburg State Polytechnic University via Coursera)
- Confronting The Big Questions: Highlights of Modern Astronomy (University of Rochester via Coursera)
- Astrophysics: Cosmology (Australian National University via edX)