Deep Learning for Symbolic Mathematics
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a video that delves into the application of neural networks to complex mathematical problems such as symbolic integration and solving differential equations. Learn about a syntax for representing mathematical expressions as token sequences and methods for generating large datasets to train sequence-to-sequence models. Discover how this approach outperforms commercial Computer Algebra Systems such as MATLAB and Mathematica on these tasks. Examine the paper by Guillaume Lample and François Charton, which challenges the notion that neural networks are limited to statistical or approximate problems. Gain insights into the use of Reverse Polish Notation and the intricacies of how the model works. Follow along as the video breaks down the process of integration and discusses important caveats in this approach to symbolic mathematics.
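The encoding idea discussed in the video can be sketched briefly: a mathematical expression is a tree, and a tree can be flattened into a token sequence that a sequence-to-sequence model consumes. Below is a minimal illustration in Python, assuming a nested-tuple tree representation (an assumption for this sketch, not the paper's exact implementation); it serializes in prefix (Polish) order, the tree-flattening scheme the paper is built around.

```python
# Minimal sketch: flatten an expression tree into a token sequence.
# Trees are nested tuples (operator, *children); leaves are variables
# or constants. This representation is illustrative, not the paper's code.

def to_prefix(node):
    """Serialize an expression tree into prefix-notation tokens."""
    if isinstance(node, tuple):          # internal node: (operator, *children)
        op, *children = node
        tokens = [op]
        for child in children:
            tokens.extend(to_prefix(child))
        return tokens
    return [str(node)]                   # leaf: variable or constant

# Example: 3 * x + sin(x)
expr = ("+", ("*", 3, "x"), ("sin", "x"))
print(to_prefix(expr))  # ['+', '*', '3', 'x', 'sin', 'x']
```

Because every operator has a fixed arity, the prefix sequence is unambiguous without parentheses, which is what makes it a convenient input and output format for a standard seq2seq model.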
Syllabus
Intro
Paper
How they did it
Reverse Polish Notation
How it works
Integration
Caveat
Taught by
Yannic Kilcher
Related Courses
Scientific Computing - University of Washington via Coursera
Dynamical Modeling Methods for Systems Biology - Icahn School of Medicine at Mount Sinai via Coursera
Elements of Structures - Massachusetts Institute of Technology via edX
Analyse numérique pour ingénieurs - École Polytechnique Fédérale de Lausanne via Coursera
Dynamics - Massachusetts Institute of Technology via edX