Optimal Sampling in Weighted Least-Squares Methods - Application to High-Dimensional Approximation
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
Explore optimal sampling techniques in weighted least-squares methods and their application to high-dimensional approximation in this 48-minute lecture from the Alan Turing Institute. Delve into the challenges of reconstructing complex processes that depend on many parameters, and the resulting curse of dimensionality in function approximation. Learn about modern approaches that overcome these limitations by leveraging structural assumptions such as low intrinsic dimensionality, partial separability, and sparse representations. Examine the mathematical foundations of high-dimensional approximation, covering multivariate approximation theory, high-dimensional integration, and non-parametric regression. Gain insights into key concepts including least-squares methods, deterministic counterparts, noise levels, concentration inequalities, stability regimes, and adaptive methods. Discover how these techniques contribute to solving common problems in science and engineering involving complex, parameter-dependent processes.
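As background for the lecture's topic, the following minimal sketch (not taken from the lecture; the target function, basis size, and sample budget are illustrative) shows a weighted least-squares fit on [-1, 1] in an orthonormal Legendre basis, where sample points are drawn from a Christoffel-function-based "optimal" density by rejection sampling and the residuals are weighted accordingly.

import numpy as np
from numpy.polynomial import legendre


def onb(x, m):
    """Legendre basis orthonormal w.r.t. the uniform probability measure on [-1, 1]."""
    V = legendre.legvander(x, m - 1)               # columns P_0 .. P_{m-1}
    return V * np.sqrt(2 * np.arange(m) + 1)       # phi_j = sqrt(2j+1) P_j


def sample_optimal(n, m, rng):
    """Draw n points from the density (1/m) sum_j phi_j(x)^2 dx/2 by rejection sampling."""
    out = []
    while len(out) < n:
        x = rng.uniform(-1.0, 1.0)
        k = np.sum(onb(np.array([x]), m) ** 2)     # Christoffel-type function k_m(x)
        if rng.uniform() < k / m**2:               # envelope: k_m(x) <= m^2 on [-1, 1]
            out.append(x)
    return np.array(out)


def weighted_ls_fit(f, n, m, seed=0):
    """Weighted least-squares fit with the optimal weights w(x) = m / k_m(x)."""
    rng = np.random.default_rng(seed)
    x = sample_optimal(n, m, rng)
    Phi = onb(x, m)                                # n-by-m design matrix
    w = m / np.sum(Phi**2, axis=1)                 # weights cancel the sampling density
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(sw * Phi, sw[:, 0] * f(x), rcond=None)
    return coef


if __name__ == "__main__":
    f = lambda x: np.exp(x) * np.sin(3 * x)        # hypothetical target function
    m, n = 8, 40                                   # basis dimension and sample budget
    c = weighted_ls_fit(f, n, m)
    xt = np.linspace(-1, 1, 200)
    print("max error on a test grid:", np.max(np.abs(onb(xt, m) @ c - f(xt))))

Broadly, the idea behind this kind of sampling is that a sample budget of the order of m log m suffices to keep the weighted least-squares system well conditioned with high probability, which is what the lecture's discussion of concentration inequalities and stability regimes addresses.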
Syllabus
Introduction
High-dimensional approximation
Common problems
Least-squares method
General question
Deterministic counterpart
Goal
Key ingredient
Noise level
Concentration inequality
Stability regime
High-dimensional PDEs
Curse of dimensionality
Adaptive method
References
Taught by
Alan Turing Institute
Related Courses
Dimensionality Reduction in Python
DataCamp
Sparse Nonlinear Dynamics Models with SINDy - The Library of Candidate Nonlinearities
Steve Brunton via YouTube
Overcoming the Curse of Dimensionality and Mode Collapse - Ke Li
Institute for Advanced Study via YouTube
Emergent Linguistic Structure in Deep Contextual Neural Word Representations - Chris Manning
Institute for Advanced Study via YouTube
Multilevel Weighted Least Squares Polynomial Approximation - Sören Wolfers, KAUST
Alan Turing Institute via YouTube