Manifold Learning with Noisy Data: Dimension Reduction and Support Estimation
Offered By: Institut des Hautes Etudes Scientifiques (IHES) via YouTube
Course Description
Overview
Explore manifold learning with noisy data in this 43-minute lecture by Elisabeth Gassiat of LMO/Université Paris-Saclay. Delve into a general framework for recovering low-dimensional non-linear structures from high-dimensional data contaminated with significant, unknown noise, and examine minimax rates for support estimation under the Hausdorff distance. Cover dimension reduction, core ideas of manifold learning, geometric approaches to noisy data, examples of additive noise, and support estimation via deconvolution with Gaussian noise. Investigate the identifiability theorem, geometric conditions for high-dimensional data, and concrete examples of supports, before closing with an estimation upper bound and a take-home message on manifold learning in the presence of noise.
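To make the setting concrete, here is a minimal sketch (not taken from the lecture) of the additive-noise model the overview alludes to: observations Y = X + eps, where X lies on a low-dimensional structure embedded in a higher-dimensional space and eps is Gaussian noise, together with the Hausdorff distance used as a loss for support estimation. The circle, sample sizes, and noise level are illustrative choices only.

```python
# Illustrative sketch (not the lecture's method): additive-noise model
# Y = X + eps with X on a 1-dimensional manifold (a circle) embedded in R^3,
# and the Hausdorff distance as a loss between a candidate support and the truth.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)

# Latent points X on the circle (the true low-dimensional support).
theta = rng.uniform(0, 2 * np.pi, size=500)
X = np.column_stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)])

# Observations Y = X + eps with (here) Gaussian noise of level sigma.
sigma = 0.3
Y = X + sigma * rng.normal(size=X.shape)

# Dense reference sample of the true support M.
grid = np.linspace(0, 2 * np.pi, 2000)
M = np.column_stack([np.cos(grid), np.sin(grid), np.zeros_like(grid)])

def hausdorff(A, B):
    # Symmetric Hausdorff distance: max of the two directed distances.
    return max(directed_hausdorff(A, B)[0], directed_hausdorff(B, A)[0])

# Using the raw noisy sample as a (naive) support estimate shows why the noise
# must be deconvolved before the support can be estimated accurately.
print("Hausdorff(noisy sample, true support):", hausdorff(Y, M))
print("Hausdorff(clean sample, true support):", hausdorff(X, M))
```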
Syllabus
Intro
Dimension reduction
Manifold learning: some ideas (no noise)
Noisy data. What happens with noise? Geometric ideas (a toy illustration follows the syllabus)
Additive noise: examples
Support estimation: deconvolution with Gaussian noise and (truncated) Hausdorff loss
Robustness to the assumptions on the noise
First question: identifiability
Identifiability theorem
When does HD hold? Simple facts.
When does HD hold? Geometrical condition
When does HD hold? Examples of supports
Second question: estimation (upper bound)
Take-home message
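As a toy illustration of the syllabus items on manifold learning with and without noise: a sketch, assuming a standard scikit-learn pipeline rather than the speaker's approach, in which Isomap recovers the two-dimensional parametrisation of a clean Swiss roll while the same embedding applied to a noisy sample is typically distorted. This is the failure mode that motivates noise-aware support estimation.

```python
# Assumed toy setup (not from the lecture): how additive noise degrades a
# standard manifold-learning embedding. The Swiss roll is a 2-D surface in R^3.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Clean and noisy samples from the same underlying surface.
X_clean, t = make_swiss_roll(n_samples=1500, noise=0.0, random_state=0)
X_noisy, _ = make_swiss_roll(n_samples=1500, noise=1.0, random_state=0)

embed = Isomap(n_neighbors=10, n_components=2)
Z_clean = embed.fit_transform(X_clean)  # faithful 2-D unrolling
Z_noisy = embed.fit_transform(X_noisy)  # neighbourhood graph perturbed by noise

print("clean embedding shape:", Z_clean.shape)
print("noisy embedding shape:", Z_noisy.shape)
```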
Taught by
Institut des Hautes Etudes Scientifiques (IHES)
Related Courses
Data Analysis and Visualization (Georgia Institute of Technology via Udacity)
Dimensionality Reduction in Python (DataCamp)
Deep Learning of Dynamics and Coordinates with SINDy Autoencoders (Steve Brunton via YouTube)
Recent Developments in Supervised Learning With Noise (Simons Institute via YouTube)
PCA for High-Dimensional Heteroscedastic Data (Fields Institute via YouTube)