Deep Learning for Symbolic Mathematics

Offered By: Yannic Kilcher via YouTube

Tags

Neural Networks, Deep Learning, MATLAB, Sequence to Sequence Models

Course Description

Overview

Explore a video that examines how neural networks can be applied to complex mathematical problems such as symbolic integration and differential equations. Learn about a syntax for representing mathematical expressions as token sequences and about methods for generating large datasets to train sequence-to-sequence models. Discover how this approach outperforms commercial Computer Algebra Systems such as MATLAB and Mathematica on the tasks evaluated. Examine the paper by Guillaume Lample and François Charton, which challenges the notion that neural networks are limited to statistical or approximate problems. Gain insight into how expressions are encoded in Reverse Polish Notation and how the trained model operates. Follow along as the video walks through the integration task and discusses important caveats of this approach to symbolic mathematics.
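
The encoding step mentioned above can be pictured with a short sketch. The tree class, helper function, and example expression below are illustrative assumptions rather than code from the paper or video; they only show how an expression tree can be flattened into a Reverse Polish Notation token sequence that a sequence-to-sequence model could consume.

```python
# Illustrative sketch: serializing a small expression tree into
# Reverse Polish Notation tokens (operands before their operator).
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    """A node in a binary expression tree, e.g. for x^2 + 3*x."""
    token: str
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def to_rpn(node: Node) -> List[str]:
    """Post-order traversal: emit children first, then the operator."""
    tokens: List[str] = []
    if node.left:
        tokens += to_rpn(node.left)
    if node.right:
        tokens += to_rpn(node.right)
    tokens.append(node.token)
    return tokens


# Expression tree for x^2 + 3*x
expr = Node("+",
            Node("^", Node("x"), Node("2")),
            Node("*", Node("3"), Node("x")))

print(to_rpn(expr))  # ['x', '2', '^', '3', 'x', '*', '+']
```

A flat token list like this is what a sequence-to-sequence model would take as input (for example, an integrand) and produce as output (its antiderivative), in the same spirit as the approach the video describes.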

Syllabus

Intro
Paper
How they did it
Reverse Polish Notation
How it works
Integration
Caveat


Taught by

Yannic Kilcher

Related Courses

Attention Mechanism - Deutsch (Google Cloud via Coursera)
Gen AI Foundational Models for NLP & Language Understanding (IBM via Coursera)
Introduction to Attention-Based Neural Networks (LinkedIn Learning)
Encoder-Decoder Architecture (Pluralsight)
Building and Deploying Keras Models in a Multi-cloud Environment (Pluralsight)