Manifold Learning with Noisy Data: Dimension Reduction and Support Estimation
Offered By: Institut des Hautes Etudes Scientifiques (IHES) via YouTube
Course Description
Overview
Explore the intricacies of manifold learning with noisy data in this 43-minute lecture by Elisabeth Gassiat from LMO/Université Paris-Saclay. Delve into a general framework for recovering low-dimensional non-linear structures from high-dimensional data contaminated with possibly large noise of unknown distribution. Examine minimax rates for support estimation under the Hausdorff distance. Cover topics including dimension reduction, manifold learning concepts, geometric approaches to noisy data, additive noise examples, and support estimation through deconvolution with Gaussian noise. Investigate identifiability theorems, geometric conditions on high-dimensional data, and practical examples of supports. Conclude with estimation upper bounds and a take-home message on manifold learning in the presence of noisy data.
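As a rough sketch of the setting described above (the symbols Y_i, X_i, epsilon_i, M, M-hat and d_H are illustrative notation, not necessarily the lecturer's), the observations are noisy samples of points lying on an unknown low-dimensional set, and a support estimate is judged by the Hausdorff distance:

\[
  Y_i = X_i + \varepsilon_i, \qquad i = 1, \dots, n,
\]
where the $X_i$ are drawn from an unknown distribution whose support $M \subset \mathbb{R}^D$ is the low-dimensional set of interest, and the $\varepsilon_i$ are noise vectors (Gaussian in the deconvolution part of the talk). A support estimator $\hat{M}$ is then evaluated with the Hausdorff distance
\[
  d_H(\hat{M}, M) = \max\Bigl\{ \sup_{x \in \hat{M}} \inf_{y \in M} \lVert x - y \rVert,\;
                                \sup_{y \in M} \inf_{x \in \hat{M}} \lVert x - y \rVert \Bigr\}.
\]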
Syllabus
Intro
Dimension reduction
Manifold learning: some ideas (no noise)
Noisy data. What happens with noise? Geometric ideas
Additive noise: examples
Support estimation: deconvolution with Gaussian noise and (truncated) Hausdorff loss
Robustness to the assumptions on the noise
First question: identifiability
Identifiability theorem
When does HD hold? Simple facts.
When does HD hold? Geometrical condition
When does HD hold? Examples of supports
Second question: estimation (upper bound)
Take-home message
Taught by
Institut des Hautes Etudes Scientifiques (IHES)
Related Courses
Dimensionality Reduction in Python (DataCamp)
Vector Databases Professional Certificate by Weaviate (LinkedIn Learning)
Simple Parallel Coordinates Plot using d3 js (Coursera Project Network via Coursera)
Data Analysis and Visualization (Georgia Institute of Technology via Udacity)
Attacking Byzantine Robustness in High Dimensions (IEEE via YouTube)