Sampling for Linear Algebra, Statistics, and Optimization I
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the foundations of randomized numerical linear algebra in this lecture from the Foundations of Data Science Boot Camp. Delve into key concepts such as element-wise sampling, row/column sampling, and random projections as preconditioners. Learn about approximating matrix multiplication, subspace embeddings, and the roles of leverage and condition in algorithm design. Examine meta-algorithms for ℓ₂-norm and ℓ₁-norm regression, and discover structural results for least-squares approximation. Gain insights into RAM implementations and extensions to low-rank approximation using projections. Presented by Michael Mahoney of the International Computer Science Institute and UC Berkeley, this talk provides a deep dive into sampling techniques for linear algebra, statistics, and optimization.
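To make the "random projections" and "subspace embeddings" ideas mentioned above concrete, here is a minimal sketch-and-solve least-squares example in NumPy. It is an illustration, not material from the lecture: the problem sizes, the sketch dimension k, and the Gaussian sketching matrix S are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem: minimize ||A x - b||_2.
n, d = 2000, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Sketch-and-solve: compress A and b with a Gaussian random
# projection S (an oblivious subspace embedding), then solve
# the much smaller k-by-d problem exactly.
k = 200  # sketch size; theory suggests O(d / eps^2) rows
S = rng.standard_normal((k, n)) / np.sqrt(k)

x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# The sketched solution's residual is close to the optimum.
r_sketch = np.linalg.norm(A @ x_sketch - b)
r_exact = np.linalg.norm(A @ x_exact - b)
print(r_sketch / r_exact)  # ratio close to 1
```

Because S approximately preserves the geometry of the column space of [A, b], solving the small sketched problem yields a residual within a small factor of optimal, at a fraction of the cost when n is large.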
Syllabus
Intro
Outline Background and Overview
RandNLA: Randomized Numerical Linear Algebra
Basic RandNLA Principles
Element-wise Sampling
Row/column Sampling
Random Projections as Preconditioners
Approximating Matrix Multiplication
Subspace Embeddings
Two important notions: leverage and condition
Meta-algorithm for ℓ₂-norm regression (2 of 3)
Meta-algorithm for ℓ₁-norm regression (3 of 3)
Least-squares approximation: the basic structural result
Least-squares approximation: RAM implementations
Extensions to Low-rank Approximation (Projections)
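The syllabus items on row/column sampling and leverage scores can be illustrated with a short NumPy example. This is a hedged sketch, not the lecture's own code: the sample size k, the seed, and the use of an exact QR-based leverage-score computation (rather than the fast approximations discussed in RandNLA) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 1000, 5
A = rng.standard_normal((n, d))

# Exact leverage scores: squared row norms of an orthonormal
# basis Q for the column space of A. They sum to d = rank(A).
Q, _ = np.linalg.qr(A)
lev = np.sum(Q**2, axis=1)
p = lev / lev.sum()  # row-sampling probabilities

# Sample k rows with replacement, rescaling each kept row by
# 1/sqrt(k * p_i) so that (SA)^T (SA) is an unbiased estimate
# of A^T A -- the approximate matrix multiplication primitive.
k = 300
idx = rng.choice(n, size=k, p=p)
SA = A[idx] / np.sqrt(k * p[idx, None])

err = np.linalg.norm(SA.T @ SA - A.T @ A) / np.linalg.norm(A.T @ A)
print(err)  # small relative error
```

Sampling proportional to leverage scores concentrates effort on the most "influential" rows, which is why a few hundred sampled rows can approximate the full n-row Gram matrix well.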
Taught by
Simons Institute
Related Courses
Quantum-Inspired Classical Linear Algebra (Simons Institute via YouTube)
Foundations of Data Science II (Simons Institute via YouTube)
Near Optimal Linear Algebra in the Online and Sliding Window Models (IEEE via YouTube)
Low Rank Approximation in Electron Excitation Calculations - IPAM at UCLA (Institute for Pure & Applied Mathematics (IPAM) via YouTube)
Learning-Based Low-Rank Approximations - IPAM at UCLA (Institute for Pure & Applied Mathematics (IPAM) via YouTube)