Bayesian Inference in Generative Models

Offered By: MITCBMM via YouTube

Tags

Bayesian Statistics Courses, Neural Networks Courses, Neuroscience Courses, Cognitive Sciences Courses, Bayesian Inference Courses, Generative Models Courses

Course Description

Overview

Explore Bayesian inference in generative models through this comprehensive 50-minute tutorial by Luke Hewitt from MIT. Delve into various approximate inference methods, including sampling-based techniques like MCMC and particle filters, as well as variational inference. Learn how neural networks can enhance these methods and discover the power of probabilistic programming languages for black-box Bayesian inference in complex models. Engage in hands-on exercises to implement inference algorithms for simple models and explore complex models using probabilistic programming languages. Access additional resources, including slides, references, and exercises, to further enhance your understanding of topics such as exact inference, Monte Carlo methods, gradient descent, normalizing flows, and more.
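As a taste of the sampling-based techniques the tutorial covers, below is a minimal sketch of random-walk Metropolis-Hastings for a toy Gaussian model. The model, observed value, and step size are illustrative assumptions for this listing, not code from the tutorial's exercises.

import numpy as np

def log_joint(mu, x_obs):
    # log p(mu) + log p(x_obs | mu) for mu ~ Normal(0, 1), x ~ Normal(mu, 1),
    # up to additive constants shared by all values of mu
    return -0.5 * mu**2 - 0.5 * (x_obs - mu)**2

def metropolis_hastings(x_obs, n_samples=5000, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    mu = 0.0                                          # initial state of the chain
    samples = []
    for _ in range(n_samples):
        proposal = mu + step * rng.standard_normal()  # symmetric random-walk proposal
        log_accept = log_joint(proposal, x_obs) - log_joint(mu, x_obs)
        if np.log(rng.uniform()) < log_accept:
            mu = proposal                             # accept; otherwise keep current state
        samples.append(mu)
    return np.array(samples)

samples = metropolis_hastings(x_obs=2.5)
# The exact posterior here is Normal(1.25, 0.5); the post-burn-in sample mean
# and variance should land close to those values.
print(samples[1000:].mean(), samples[1000:].var())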

Syllabus

Introduction
Exact Inference
Monte Carlo Methods
Markov Chain Monte Carlo
MTM
Variational Inference
Gradient Descent
Normalizing Flows
Variational Methods
Probabilistic Programming Languages
Summary
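
The variational inference and gradient descent segments listed above treat inference as optimization. As a hedged illustration of that idea, the sketch below fits a Gaussian approximation to the same toy model used earlier by stochastic gradient ascent on the ELBO with the reparameterization trick; it is an assumed example, not the tutorial's own code.

import numpy as np

def grad_log_joint(mu, x_obs):
    # d/dmu of [-0.5*mu**2 - 0.5*(x_obs - mu)**2]
    return x_obs - 2.0 * mu

def fit_gaussian_vi(x_obs, n_steps=5000, lr=0.01, seed=0):
    rng = np.random.default_rng(seed)
    m, rho = 0.0, 0.0                     # variational mean and log std. deviation
    for _ in range(n_steps):
        eps = rng.standard_normal()
        s = np.exp(rho)
        mu = m + s * eps                  # reparameterized sample from q(mu) = Normal(m, s^2)
        g = grad_log_joint(mu, x_obs)
        m += lr * g                       # single-sample ELBO gradient w.r.t. m
        rho += lr * (g * eps * s + 1.0)   # ... w.r.t. rho, including the entropy term
    return m, np.exp(rho)

m, s = fit_gaussian_vi(x_obs=2.5)
# The exact posterior is Normal(1.25, 0.5), so m should settle near 1.25 and s near 0.71.
print(m, s)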


Taught by

MITCBMM

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX