YoVDO

An ℓp Theory of PCA and Spectral Clustering

Offered By: BIMSA via YouTube

Tags

Principal Component Analysis Courses
Statistics & Probability Courses
Machine Learning Courses
Linear Algebra Courses
Hilbert Spaces Courses
Eigenvectors Courses

Course Description

Overview

Explore an advanced lecture on Principal Component Analysis (PCA) and spectral clustering presented by Kaizheng Wang at the ICBS2024 conference. Delve into a novel $\ell_p$ perturbation theory for a hollowed version of PCA in Hilbert spaces, designed to improve upon traditional PCA methods when dealing with heteroscedastic noise. Examine the entrywise behavior of principal component score vectors and their approximation by linear functionals of the Gram matrix in the $\ell_p$ norm. Investigate how the choice of $p$ affects optimal bounds in sub-Gaussian mixture models, leading to optimality guarantees for spectral clustering. Discover how this $\ell_p$ theory applies to contextual community detection, resulting in simple spectral algorithms that achieve the information threshold for exact recovery and optimal misclassification rates. Gain insights into cutting-edge research that bridges statistical theory with practical applications in machine learning and data analysis over the course of this 49-minute lecture.
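
To make the pipeline concrete, here is a minimal sketch of spectral clustering built on a hollowed Gram matrix. It assumes that "hollowing" means zeroing the diagonal of the Gram matrix (suppressing the diagonal bias that heteroscedastic noise introduces) and that k-means is applied to the leading eigenvectors; the function name and these choices are illustrative and not taken from the lecture itself.

```python
# Illustrative sketch (not the lecture's code): spectral clustering on a
# "hollowed" Gram matrix. Hollowing is assumed here to mean zeroing the
# diagonal of the Gram matrix; k-means on the top-k eigenvectors is an
# assumed final step.
import numpy as np
from sklearn.cluster import KMeans

def hollowed_spectral_clustering(X, k, seed=0):
    """Cluster the n rows of X (an n x d data matrix) into k groups."""
    G = X @ X.T                      # Gram matrix of the samples
    np.fill_diagonal(G, 0.0)         # hollowing: discard the noisy diagonal
    _, vecs = np.linalg.eigh(G)      # eigenvectors, eigenvalues in ascending order
    U = vecs[:, -k:]                 # top-k eigenvectors ~ PC score vectors
    return KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(U)

# Toy usage: two well-separated sub-Gaussian clusters in 50 dimensions.
rng = np.random.default_rng(0)
centers = np.vstack([np.ones(50), -np.ones(50)])
X = centers[np.repeat([0, 1], 100)] + rng.normal(size=(200, 50))
print(hollowed_spectral_clustering(X, k=2)[:10])
```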

Syllabus

Kaizheng Wang: An $\ell_{p}$ theory of PCA and spectral clustering #ICBS2024


Taught by

BIMSA

Related Courses

Algèbre Linéaire (Partie 2)
École Polytechnique Fédérale de Lausanne via edX
Algèbre Linéaire (Partie 3)
École Polytechnique Fédérale de Lausanne via edX
Mécanique Lagrangienne
École Polytechnique Fédérale de Lausanne via Coursera
Eigenvectors and Eigenvalues
Udacity
Differential Equations: Linear Algebra and NxN Systems of Differential Equations
Massachusetts Institute of Technology via edX