Complexity of Sparse Linear Regression
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore the complexities of sparse linear regression in this 54-minute lecture presented by Raghu Meka at IPAM's EnCORE Workshop. Delve into the gap between the number of samples that suffices statistically and what computationally efficient algorithms can achieve, particularly with Gaussian covariates. Discover new methods that overcome limitations of traditional approaches like Lasso and basis pursuit, especially when the covariance matrix of the covariates is ill-conditioned. Learn about solutions for cases involving low-treewidth dependency graphs, few bad correlations, or correlations arising from few latent variables. Examine the limitations of broad algorithm classes in this field. Gain insights from joint research with Jon Kelner, Frederic Koehler, and Dhruv Rohatgi in this exploration of sparse linear regression's fundamental role in signal processing, statistics, and machine learning.
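To make the setup concrete, here is a minimal sketch (not taken from the lecture) of the sparse linear regression problem the talk studies: Gaussian covariates, a k-sparse ground-truth coefficient vector, and a Lasso fit on both a well-conditioned and an ill-conditioned design. It assumes numpy and scikit-learn are available; the parameter choices (n, d, k, the regularization strength, and the equicorrelated covariance used to inflate the condition number) are illustrative assumptions, not values from the talk.

```python
# Illustrative sketch (not from the lecture): sparse linear regression with
# Gaussian covariates, checking Lasso support recovery on a well-conditioned
# versus an ill-conditioned covariance matrix.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d, k = 200, 500, 5  # samples, dimension, sparsity (illustrative values)

# k-sparse ground-truth coefficient vector with unit-magnitude entries
w_true = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
w_true[support] = rng.choice([-1.0, 1.0], size=k)

def run(cov):
    """Draw X ~ N(0, cov), set y = X w* + noise, fit Lasso, report support recovery."""
    X = rng.multivariate_normal(np.zeros(d), cov, size=n)
    y = X @ w_true + 0.1 * rng.normal(size=n)
    w_hat = Lasso(alpha=0.05, max_iter=10_000).fit(X, y).coef_
    recovered = np.flatnonzero(np.abs(w_hat) > 1e-3)
    return np.array_equal(np.sort(recovered), np.sort(support))

# Well-conditioned design: identity covariance
print("identity covariance, support recovered:", run(np.eye(d)))

# Ill-conditioned design: strong equicorrelation (a single latent factor)
# drives the condition number up to roughly (1 + (d-1)*rho) / (1 - rho).
rho = 0.99
cov_bad = (1 - rho) * np.eye(d) + rho * np.ones((d, d))
print("equicorrelated covariance, support recovered:", run(cov_bad))
```

The equicorrelated design corresponds to one of the structured cases mentioned in the abstract (correlations arising from few latent variables), where the lecture's new methods go beyond what standard Lasso guarantees cover.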
Syllabus
Raghu Meka - Complexity of Sparse Linear Regression - IPAM at UCLA
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent