Sampling for Linear Algebra, Statistics, and Optimization I
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the foundations of randomized numerical linear algebra (RandNLA) in this lecture from the Foundations of Data Science Boot Camp. Delve into key concepts such as element-wise sampling, row/column sampling, and random projections as preconditioners. Learn about approximating matrix multiplication, subspace embeddings, and the roles of leverage and condition in algorithm design. Examine a meta-algorithm for ℓ2-norm regression, and discover the basic structural result for least-squares approximation, its RAM implementations, and extensions to low-rank approximation using projections. Presented by Michael Mahoney of the International Computer Science Institute and UC Berkeley, this talk provides a deep dive into sampling techniques for linear algebra, statistics, and optimization.
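To give a flavor of the sketch-and-solve paradigm the lecture covers (random projections applied to least-squares), here is a minimal NumPy sketch. The problem sizes and the sketch dimension `sketch_rows` are illustrative choices, not values from the talk.

```python
# Sketch-and-solve least squares: compress a tall problem with a Gaussian
# random projection, then solve the much smaller sketched problem.
# All dimensions below are illustrative, not taken from the lecture.
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem: minimize ||Ax - b||_2 with n >> d.
n, d = 10_000, 20
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Gaussian sketch S (sketch_rows x n); solve min ||S A x - S b||_2 instead.
sketch_rows = 200
S = rng.standard_normal((sketch_rows, n)) / np.sqrt(sketch_rows)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Compare the sketched solution's residual with the exact one.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
rel_resid = np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b)
print(rel_resid)  # >= 1 by optimality of x_exact; close to 1 when the sketch works
```

Because a Gaussian sketch of this size is a subspace embedding for the column span of `[A, b]` with high probability, the sketched residual is within a small factor of the optimal one, while the solve touches only a 200-by-20 system.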
Syllabus
Intro
Outline Background and Overview
RandNLA: Randomized Numerical Linear Algebra
Basic RandNLA Principles
Element-wise Sampling
Row/column Sampling
Random Projections as Preconditioners
Approximating Matrix Multiplication
Subspace Embeddings
Two important notions: leverage and condition
Meta-algorithm for ℓ2-norm regression (2 of 3)
Meta-algorithm for ℓ2-norm regression (3 of 3)
Least-squares approximation: the basic structural result
Least-squares approximation: RAM implementations
Extensions to Low-rank Approximation (Projections)
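One syllabus topic, approximating matrix multiplication, can be sketched in a few lines: estimate A @ B by sampling column/row pairs with probability proportional to the product of their norms and rescaling for unbiasedness. The sample size `c` and matrix dimensions here are illustrative assumptions, not values from the lecture.

```python
# Approximate matrix multiplication by importance sampling, in the spirit of
# the BasicMatrixMultiplication idea from RandNLA. Dimensions and the sample
# size c are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)

m, n, p = 100, 2_000, 80
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

# Near-optimal sampling probabilities: proportional to ||A[:, i]|| * ||B[i, :]||.
norms = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
probs = norms / norms.sum()

c = 500  # number of sampled column/row pairs
idx = rng.choice(n, size=c, p=probs)
scale = 1.0 / (c * probs[idx])  # rescale so the estimator is unbiased
estimate = (A[:, idx] * scale) @ B[idx, :]

# The standard guarantee is additive: error ~ ||A||_F * ||B||_F / sqrt(c).
err = np.linalg.norm(A @ B - estimate)
bound_scale = np.linalg.norm(A) * np.linalg.norm(B)
print(err / bound_scale)  # roughly 1/sqrt(c), i.e. a few percent here
```

The rescaling by `1 / (c * probs[idx])` makes each sampled outer product an unbiased estimate of A @ B, and the norm-proportional probabilities minimize the variance of that estimator.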
Taught by
Simons Institute
Related Courses
Data Analysis - Johns Hopkins University via Coursera
Computing for Data Analysis - Johns Hopkins University via Coursera
Scientific Computing - University of Washington via Coursera
Introduction to Data Science - University of Washington via Coursera
Web Intelligence and Big Data - Indian Institute of Technology Delhi via Coursera