Predicting the Rules Behind - Deep Symbolic Regression for Recurrent Sequences

Offered By: Yannic Kilcher via YouTube

Tags

- Machine Learning Courses
- Transformers Courses
- Deep Neural Networks Courses

Course Description

Overview

Explore deep symbolic regression for recurrent sequences in this comprehensive video featuring an interview with first author Stéphane d'Ascoli. Delve into the use of transformers for symbolic computation on integer and floating-point number sequences. Learn about the encoding of the input space, the data generation process, and the model's ability to predict and represent sequences from the On-Line Encyclopedia of Integer Sequences (OEIS). Discover potential applications beyond number sequences, success and failure cases, and experimental results. Gain insights into how the authors overcame research challenges, and interact with the provided demo to experience the model's capabilities firsthand.
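To make the setup concrete, the training idea can be pictured as sampling a random recurrence relation, rolling it out into a visible sequence of terms, and asking a model to map the terms back to the formula. The sketch below is only an illustrative toy under assumed names and a made-up operator set; it is not the paper's actual data-generation pipeline.

```python
import random

# Toy sketch (assumed, not the paper's pipeline): build (sequence, formula)
# pairs for symbolic regression of recurrent sequences.

OPERATORS = ["+", "-", "*"]  # hypothetical toy operator set


def sample_recurrence():
    """Sample a random recurrence u(n) = u(n-1) <op> c and return it as a
    callable plus its human-readable formula string."""
    op = random.choice(OPERATORS)
    const = random.randint(1, 5)
    formula = f"u(n-1) {op} {const}"
    if op == "+":
        return (lambda prev: prev + const), formula
    if op == "-":
        return (lambda prev: prev - const), formula
    return (lambda prev: prev * const), formula


def rollout(step, u0=1, length=10):
    """Evaluate the recurrence from an initial term to produce the visible sequence."""
    terms = [u0]
    for _ in range(1, length):
        terms.append(step(terms[-1]))
    return terms


if __name__ == "__main__":
    step, formula = sample_recurrence()
    terms = rollout(step)
    # A transformer would be trained to recover `formula` from `terms`.
    print(formula, terms)
```

In this toy framing, the sequence of terms plays the role of the encoded input and the formula string plays the role of the symbolic target the model must predict.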

Syllabus

- Introduction
- Summary of the Paper
- Start of Interview
- Why this research direction?
- Overview of the method
- Embedding space of input tokens
- Data generation process
- Why are transformers useful here?
- Beyond number sequences, where is this useful?
- Success cases and failure cases
- Experimental Results
- How did you overcome difficulties?
- Interactive demo


Taught by

Yannic Kilcher

Related Courses

Linear Circuits
Georgia Institute of Technology via Coursera
Introduction to Energy and Power Engineering
King Abdulaziz University via Rwaq (رواق)
Magnetic Materials and Devices
Massachusetts Institute of Technology via edX
Linear Circuits 2: AC Analysis
Georgia Institute of Technology via Coursera
Electric Power Transmission
Tecnológico de Monterrey via edX