The Mother of All Representer Theorems for Inverse Problems and Machine Learning - Michael Unser
Offered By: Alan Turing Institute via YouTube
Syllabus
Intro
Variational formulation of inverse problem
Learning as a (linear) inverse problem
RKHS representer theorem for machine learning
Is there a mother of all representer theorems?
General notion of Banach space
Dual of a Banach space
Riesz conjugate for Hilbert spaces
Generalization: Duality mapping
Properties of duality mapping
Mother of all representer theorems (Cont'd)
Kernel methods for machine learning
Tikhonov regularization (see whiteboard)
Qualitative effect of Banach conjugation
Sparsity promoting regularization
Extreme points
Geometry of l2 vs. l1 minimization
Isometry with space of Radon measures
Sparse kernel expansions (Cont'd)
Special case: Translation-invariant kernels
RKHS vs. Sparse kernel expansions (LSI)
Conclusion (Cont'd)
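The classical RKHS representer theorem referenced in the syllabus states that the minimizer of a regularized least-squares problem over a reproducing kernel Hilbert space is a finite kernel expansion centered on the data points. As a minimal illustrative sketch (not taken from the lecture; all function names and parameter values here are hypothetical choices), kernel ridge regression computes the expansion coefficients in closed form:

```python
import numpy as np

# Sketch of the RKHS representer theorem: the minimizer of
#   sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2
# over an RKHS H with kernel k has the form
#   f(x) = sum_n a_n k(x, x_n),
# and kernel ridge regression finds a by solving (K + lam*I) a = y.

def gaussian_kernel(x, z, sigma=0.2):
    """Translation-invariant (Gaussian) kernel matrix k(x_i, z_j)."""
    d2 = (x[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_ridge_fit(x, y, lam=1e-3, sigma=0.2):
    """Expansion coefficients a solving (K + lam*I) a = y."""
    K = gaussian_kernel(x, x, sigma)
    return np.linalg.solve(K + lam * np.eye(len(x)), y)

def kernel_ridge_predict(x_train, a, x_new, sigma=0.2):
    """Evaluate the kernel expansion f(x) = sum_n a_n k(x, x_n)."""
    return gaussian_kernel(x_new, x_train, sigma) @ a

# Toy 1-D regression: noisy samples of a sine wave.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(20)

a = kernel_ridge_fit(x, y)          # one coefficient per data point
y_hat = kernel_ridge_predict(x, a, x)
```

The key point the lecture generalizes is visible in the code: the solution is parameterized by exactly as many coefficients as data points, regardless of the (infinite) dimension of the hypothesis space. Unser's Banach-space extension replaces the Hilbertian norm with sparsity-promoting (l1/total-variation-type) regularization, yielding sparse kernel expansions instead.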
Taught by
Alan Turing Institute
Related Courses
Introduction to Bayesian Statistics - Databricks via Coursera
Probability via Computation - Week 4 - The Julia Programming Language via YouTube
On the Unreasonable Effectiveness of Compressive Imaging - Ben Adcock, Simon Fraser University - Alan Turing Institute via YouTube
Affine Spline Insights into Deep Learning - Richard Baraniuk, Rice University - Alan Turing Institute via YouTube
The Future of Crime Detection and Prevention - The Royal Institution via YouTube