
Learning Mixtures of Linear Regressions in Subexponential Time via Fourier Moments

Offered By: Association for Computing Machinery (ACM) via YouTube

Tags

Machine Learning Courses
Linear Systems Courses
Computational Mathematics Courses
Mixture-of-Experts Courses

Course Description

Overview

Explore a new algorithm for learning mixtures of linear regressions in subexponential time using Fourier moments in this 25-minute conference talk. Delve into solving many linear and Gaussian linear systems, examine mixture-of-experts and mixture models, and review previous results in the field. Investigate the apparent barrier at exp(k) and discover the approach that overcomes it. Learn techniques for identifying an individual component, bounding the minimum variance of a Gaussian mixture, and understanding where the exp(√k) term in the algorithm's complexity comes from. Gain insight into how all components are learned simultaneously, and consider open questions for future research in this area of machine learning and statistical analysis.
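
To make the setup concrete, the short Python sketch below (not taken from the talk) samples data from a mixture of linear regressions: each observation draws a hidden component i with some mixing probability and reports y = <w_i, x> plus Gaussian noise, while the learner only sees the pairs (x, y). The dimension, number of components, mixing weights, and noise level are illustrative assumptions, not values from the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    d, k, n = 10, 3, 5000          # dimension, number of components, samples (assumed values)
    sigma = 0.1                    # noise standard deviation (assumed value)
    weights = np.full(k, 1.0 / k)  # uniform mixing weights (assumption)
    W = rng.normal(size=(k, d))    # ground-truth regressors w_1, ..., w_k

    X = rng.normal(size=(n, d))            # covariates x ~ N(0, I_d)
    z = rng.choice(k, size=n, p=weights)   # latent component labels (never observed)
    y = np.einsum("nd,nd->n", X, W[z]) + sigma * rng.normal(size=n)

    # A learning algorithm receives only (X, y); the labels z and regressors W are hidden,
    # and the goal is to recover W up to small error.
    print(X.shape, y.shape)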

Syllabus

SOLVING MANY LINEAR SYSTEMS
SOLVING MANY GAUSSIAN LINEAR SYSTEMS
MIXTURE-OF-EXPERTS
MIXTURE MODELS
PREVIOUS RESULTS
A BARRIER AT exp(k)?
OUR RESULTS
LEARNING ONE COMPONENT
MIN VARIANCE OF A GAUSSIAN MIXTURE
WHERE DOES exp(√k) COME FROM?
LEARNING ALL COMPONENTS
OPEN QUESTIONS


Taught by

Association for Computing Machinery (ACM)

Related Courses

GShard - Scaling Giant Models with Conditional Computation and Automatic Sharding
Yannic Kilcher via YouTube
Modules and Architectures
Alfredo Canziani via YouTube
Stanford Seminar - Mixture of Experts Paradigm and the Switch Transformer
Stanford University via YouTube
Decoding Mistral AI's Large Language Models - Building Blocks and Training Strategies
Databricks via YouTube
Pioneering a Hybrid SSM Transformer Architecture - Jamba Foundation Model
Databricks via YouTube