Large Neural Nets for Amortized Probabilistic Inference in Highly Multimodal Distributions

Offered By: Toronto Machine Learning Series (TMLS) via YouTube

Tags

Neural Networks Courses, Machine Learning Courses, Reinforcement Learning Courses, Bayesian Inference Courses, Probabilistic Inference Courses

Course Description

Overview

Explore the potential of large neural networks for amortized probabilistic inference in highly multimodal distributions in this 33-minute conference talk from the Toronto Machine Learning Series. Delve into the idea of separating world models from inference mechanisms, in contrast with current large language models that fit data directly. Examine the advantages of amortized probabilistic inference, including faster run-time inference and improved generalization. Learn about generative flow networks (GFlowNets) as a novel framework for model-based machine learning, and their relationships to reinforcement learning, variational inference, and generative models. Discover recent advances in GFlowNets and their potential applications in incorporating inductive biases inspired by high-level human cognition. Gain insights into building AI systems that understand the world in a Bayesian and causal way and can generate probabilistically truthful statements.
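
For context only (this equation is not part of the talk description), a widely used GFlowNet training objective is trajectory balance: a forward policy P_F, a backward policy P_B, and a learned normalizing constant Z_theta are fit so that terminal objects x are sampled in proportion to a reward R(x). In LaTeX notation, for a trajectory tau = (s_0 -> ... -> s_n = x):

\mathcal{L}_{\mathrm{TB}}(\tau;\theta) =
\left(
\log \frac{Z_\theta \prod_{t=0}^{n-1} P_F(s_{t+1} \mid s_t;\theta)}
          {R(x) \prod_{t=0}^{n-1} P_B(s_t \mid s_{t+1};\theta)}
\right)^2

Minimizing this loss over sampled trajectories is what makes the inference amortized: after training, drawing approximate samples from the multimodal target only requires forward passes of the policy network, rather than running a fresh MCMC or optimization procedure at inference time.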

Syllabus

Large Neural Nets for Amortized Probabilistic Inference for Highly Multimodal Distributions and Mode


Taught by

Toronto Machine Learning Series (TMLS)

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX