Priors for Semantic Variables - Yoshua Bengio
Offered By: Institute for Advanced Study via YouTube
Course Description
Overview
Explore the frontiers of machine learning in this seminar on Theoretical Machine Learning, featuring renowned researcher Yoshua Bengio from Université de Montréal. Delve into the concept of priors for semantic variables and examine the limitations of current machine learning approaches. Gain insights into systematization, learning theory, and the role of conscious processing in AI. Investigate the differences between System 1 and System 2 thinking, and explore knowledge representation, attention mechanisms, and the Global Workspace Theory. Discover the importance of causality, the Independent Mechanism Hypothesis, and localized changes in AI systems. Learn about parameterization, multitask learning, and modular recurrent networks as Bengio shares cutting-edge ideas on advancing machine learning capabilities.
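For readers unfamiliar with the attention mechanisms mentioned in the overview, the following is a minimal sketch of generic scaled dot-product attention in NumPy. It is an illustrative assumption, not Bengio's specific formulation from the seminar: each query produces a softmax-weighted sum over a set of value vectors, which is the basic operation underlying the attention-based architectures the talk discusses.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Generic scaled dot-product attention (illustrative sketch only,
    not the seminar's exact formulation). Each query attends to all
    keys and returns a weighted sum of the corresponding values."""
    d_k = keys.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d_k)
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax over the key dimension (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values, weights

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4))
k = rng.standard_normal((3, 4))
v = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)        # (2, 4): one output vector per query
print(w.sum(axis=-1))   # each query's weights sum to 1
```

Soft attention of this kind lets a model dynamically select which pieces of information to combine, which connects to the talk's themes of Recurrent Independent Mechanisms and the Global Workspace Theory, where a small set of modules compete for access to a shared workspace.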
Syllabus
Introduction
Limitations of machine learning
Systematization
Learning theory
Conscious processing
Agency
System 1 vs System 2
The Kind of Knowledge
Knowledge Representation
Attention
Recurrent Independent Mechanisms
Global Workspace Theory
Attention Mechanisms
Causality
Independent Mechanism Hypothesis
Localized Changes
Parameterization
Multitask learning
Modular recurrent net
Taught by
Institute for Advanced Study
Related Courses
Latent State Recovery in Reinforcement Learning - John Langford (Institute for Advanced Study via YouTube)
On the Critic Function of Implicit Generative Models - Arthur Gretton (Institute for Advanced Study via YouTube)
Instance-Hiding Schemes for Private Distributed Learning (Institute for Advanced Study via YouTube)
Learning Probability Distributions - What Can, What Can't Be Done - Shai Ben-David (Institute for Advanced Study via YouTube)
Latent Stochastic Differential Equations for Irregularly-Sampled Time Series - David Duvenaud (Institute for Advanced Study via YouTube)