A Cookbook for Deep Continuous-Time Predictive Models
Offered By: Toronto Machine Learning Series (TMLS) via YouTube
Course Description
Overview
Explore deep continuous-time predictive models for irregularly-sampled, sparse time series in this 47-minute conference talk by David Duvenaud, Assistant Professor at the University of Toronto. Begin with simple feedforward approaches and progress to latent-variable stochastic differential equation models, and learn how regularizing differential equation-based models improves computational efficiency. Topics include ordinary differential equations, autoregressive continuous-time models, limitations of RNN-based models, latent variable models, ODE latent-variable models, Poisson process likelihoods, stochastic transition dynamics, and variational inference. Gain insight into the strengths and limitations of each model class, including SDEs and Latent ODEs, and how they compare in predictive accuracy on datasets such as PhysioNet.
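As a rough illustration of the core idea behind the talk (not code from the talk itself), a continuous-time model can evaluate its latent state at arbitrary query times by integrating an ODE, which is what makes irregular sampling natural to handle. The sketch below uses a hand-rolled explicit Euler solver and a toy linear dynamics function standing in for a learned neural network; all names and the dynamics matrix are hypothetical.

```python
import numpy as np

def euler_odeint(f, z0, t_grid):
    """Integrate dz/dt = f(z, t) with explicit Euler over t_grid.

    Returns the state at every time in t_grid; the spacing between
    times may be irregular, unlike a fixed-step RNN.
    """
    zs = [np.asarray(z0, dtype=float)]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        z = zs[-1]
        zs.append(z + (t1 - t0) * f(z, t0))
    return np.stack(zs)

# Toy latent dynamics: a damped rotation, a stand-in for a learned f_theta.
A = np.array([[-0.1, -1.0],
              [ 1.0, -0.1]])

def dynamics(z, t):
    return A @ z

# Irregular observation times, as in the sparse time-series setting of the talk.
t_obs = np.array([0.0, 0.3, 0.35, 1.2, 2.0])
latents = euler_odeint(dynamics, z0=[1.0, 0.0], t_grid=t_obs)
print(latents.shape)  # one latent state per (irregular) observation time
```

In practice, models like Latent ODEs replace the fixed-step Euler loop with an adaptive solver and learn the dynamics function from data; the point of the sketch is only that evaluation times are inputs, not a fixed grid.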
Syllabus
Intro
A Cookbook For Deep Continuous-Time Predictive Models
Motivation: Irregularly-timed datasets
Simplest options
Ordinary Differential Equations
Autoregressive continuous-time
Limitations of RNN-based models
Latent variable models
ODE latent-variable model
Physionet: Predictive accuracy
Poisson Process Likelihoods
Limitations of Latent ODEs
Stochastic transition dynamics
What are SDEs good for?
What is "running an SDE backwards"?
Variational inference
1D Latent SDE
Summary
Related work 1
Related work 2
Taught by
Toronto Machine Learning Series (TMLS)
Related Courses
Topographic VAEs Learn Equivariant Capsules - Machine Learning Research Paper Explained
Yannic Kilcher via YouTube
Deep Generative Modeling
Alexander Amini via YouTube
Learning What We Know and Knowing What We Learn - Gaussian Process Priors for Neural Data Analysis
MIT CBMM via YouTube