
Large Neural Nets for Amortized Probabilistic Inference in Highly Multimodal Distributions

Offered By: Toronto Machine Learning Series (TMLS) via YouTube

Tags

Neural Networks Courses
Machine Learning Courses
Reinforcement Learning Courses
Bayesian Inference Courses
Probabilistic Inference Courses

Course Description

Overview

Explore the potential of large neural networks for amortized probabilistic inference in highly multimodal distributions in this 33-minute conference talk from the Toronto Machine Learning Series. Delve into the idea of separating world models from inference mechanisms, in contrast with current large language models that fit data directly. Examine the advantages of amortized probabilistic inference, including faster run-time inference and improved generalization. Learn about generative flow networks (GFlowNets) as a framework for model-based machine learning and their relationships to reinforcement learning, variational inference, and generative models. Discover recent advances in GFlowNets and their potential for incorporating inductive biases inspired by high-level human cognition. Gain insights into building AI systems that understand the world in a Bayesian and causal way and can generate probabilistically truthful statements.
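
For context on the kind of training objective discussed in the talk, the following is a minimal sketch, assuming a toy chain environment and the trajectory-balance loss from the GFlowNet literature; the network sizes, reward table, and hyperparameters are illustrative assumptions and are not taken from the talk.

# A minimal, illustrative sketch (not from the talk) of the trajectory-balance
# objective commonly used to train GFlowNets for amortized sampling from an
# unnormalized, multimodal reward. The toy chain environment, reward table, and
# network sizes are assumptions made purely for illustration.
import torch
import torch.nn as nn

class TinyGFlowNet(nn.Module):
    # Forward policy over states 0..n_states-1 with two actions: advance or stop.
    def __init__(self, n_states=8, hidden=32):
        super().__init__()
        self.n_states = n_states
        self.policy = nn.Sequential(nn.Linear(n_states, hidden), nn.ReLU(), nn.Linear(hidden, 2))
        self.log_z = nn.Parameter(torch.zeros(1))  # learned log partition function

    def forward(self, state_idx):
        one_hot = torch.nn.functional.one_hot(state_idx, self.n_states).float()
        return torch.log_softmax(self.policy(one_hot), dim=-1)

def trajectory_balance_loss(model, reward_fn, max_len=7):
    # Sample one trajectory; on a single-parent chain the backward-policy term
    # vanishes, so the loss reduces to (log Z + sum log P_F - log R(x))^2.
    state, log_pf = torch.tensor(0), model.log_z.clone()
    for _ in range(max_len):
        logits = model(state)
        action = torch.distributions.Categorical(logits=logits).sample()
        log_pf = log_pf + logits[action]
        if action.item() == 1 or state.item() == model.n_states - 1:
            break
        state = state + 1
    return (log_pf - torch.log(reward_fn(state))).pow(2).squeeze()

# Toy reward with two modes (states 2 and 6); training drives the amortized
# sampler to visit both modes with probability proportional to their reward.
reward = lambda s: torch.tensor([0.1, 0.2, 3.0, 0.2, 0.1, 0.2, 3.0, 0.2])[s]
model = TinyGFlowNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    trajectory_balance_loss(model, reward).backward()
    opt.step()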

Syllabus

Large Neural Nets for Amortized Probabilistic Inference in Highly Multimodal Distributions


Taught by

Toronto Machine Learning Series (TMLS)

Related Courses

Computational Neuroscience
University of Washington via Coursera
Reinforcement Learning
Brown University via Udacity
Reinforcement Learning
Indian Institute of Technology Madras via Swayam
FA17: Machine Learning
Georgia Institute of Technology via edX
Introduction to Reinforcement Learning
Higher School of Economics via Coursera