Generalizing Outside the Training Distribution through Compositional Generation
Offered By: Paul G. Allen School via YouTube
Course Description
Overview
Explore a lecture on compositional generative modeling and its role in helping AI systems generalize outside the training distribution. Delve into energy-based models and how they enable compositional generation. Discover how these models can synthesize complex plans for unseen tasks at inference time. Learn how compositionality can be applied across multiple foundation models trained on diverse Internet data, enabling decision-making systems that perform hierarchical planning and solve long-horizon problems in a zero-shot manner. Gain insights from Yilun Du, a final-year PhD student at MIT CSAIL, as he shares research spanning machine learning, computer vision, and robotics, with a focus on generative models.
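As a rough illustration of the composition idea mentioned above, the toy Python sketch below (not taken from the lecture; the quadratic "energies" and all constants are hypothetical stand-ins for learned models) sums the energies of two independently trained energy-based models and draws a sample from the composed distribution with Langevin dynamics.

```python
# Minimal sketch: composing two energy-based models at inference time by
# summing their energies, then sampling with Langevin dynamics.
import numpy as np

def energy_a(x):
    # Hypothetical concept A: low energy near x = 2.0
    return 0.5 * (x - 2.0) ** 2

def energy_b(x):
    # Hypothetical concept B: low energy near x = -1.0
    return 0.5 * (x + 1.0) ** 2

def composed_energy(x):
    # Composition: a sample must satisfy both concepts at once.
    return energy_a(x) + energy_b(x)

def grad(f, x, eps=1e-4):
    # Finite-difference gradient, sufficient for this toy scalar example.
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def langevin_sample(f, steps=500, step_size=0.01, seed=0):
    # Noisy gradient descent on the energy approximates sampling from
    # p(x) proportional to exp(-f(x)).
    rng = np.random.default_rng(seed)
    x = rng.normal()
    for _ in range(steps):
        x = x - step_size * grad(f, x) + np.sqrt(2 * step_size) * rng.normal()
    return x

print("sample from composed model:", langevin_sample(composed_energy))
# Samples concentrate around 0.5, a compromise between the two concepts,
# even though neither model was trained on that combination.
```

The same additive composition underlies the lecture's broader theme: combining separately trained models at inference time to handle combinations of constraints never seen during training.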
Syllabus
Generalizing Outside the Training Distribution through Compositional Generation: Yilun Du (MIT)
Taught by
Paul G. Allen School
Related Courses
A Path Towards Autonomous Machine Intelligence - Paper Explained (Yannic Kilcher via YouTube)
Author Interview - VOS: Learning What You Don't Know by Virtual Outlier Synthesis (Yannic Kilcher via YouTube)
Self-Supervised Learning - The Dark Matter of Intelligence (Yannic Kilcher via YouTube)
Backpropagation and Deep Learning in the Brain (Simons Institute via YouTube)
On the Critic Function of Implicit Generative Models - Arthur Gretton (Institute for Advanced Study via YouTube)