The Mother of All Representer Theorems for Inverse Problems and Machine Learning - Michael Unser
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
Syllabus
Intro
Variational formulation of inverse problem
Learning as a (linear) inverse problem
RKHS representer theorem for machine learning
Is there a mother of all representer theorems?
General notion of Banach space
Dual of a Banach space
Riesz conjugate for Hilbert spaces
Generalization: Duality mapping
Properties of duality mapping
Mother of all representer theorems (Cont'd)
Kernel methods for machine learning
Tikhonov regularization (see whiteboard)
Qualitative effect of Banach conjugation
Sparsity promoting regularization
Extreme points
Geometry of ℓ2 vs. ℓ1 minimization
Isometry with space of Radon measures
Sparse kernel expansions (Cont'd)
Special case: Translation-invariant kernels
RKHS vs. Sparse kernel expansions (LSI)
Conclusion (Cont'd)
Taught by
Alan Turing Institute
Related Courses
機器學習技法 (Machine Learning Techniques) - National Taiwan University via Coursera
Utilisez des modèles supervisés non linéaires (Use Non-Linear Supervised Models) - CentraleSupélec via OpenClassrooms
Statistical Machine Learning - Eberhard Karls University of Tübingen via YouTube
Interplay of Linear Algebra, Machine Learning, and HPC - JuliaCon 2021 Keynote - The Julia Programming Language via YouTube
Interpolation and Learning With Scale Dependent Kernels - MITCBMM via YouTube