YoVDO

Regularized Least Squares

Offered By: MITCBMM via YouTube

Tags

Statistical Learning Theory Courses
Gradient Descent Courses
Quadratic Programming Courses
Loss Functions Courses

Course Description

Overview

Explore regularized least squares in this comprehensive lecture by Lorenzo Rosasco of MIT, the University of Genoa, and IIT. Delve into key concepts including loss functions, optimality conditions, quadratic programming, and gradient descent. Learn about condition numbers, support vectors, and the perceptron as part of the 9.520/6.860S Statistical Learning Theory and Applications course. Gain valuable insights into statistical learning theory and its practical applications over the course of 80 minutes.
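As a minimal sketch of the lecture's central technique (not code from the course itself), regularized least squares minimizes ‖Xw − y‖² + λ‖w‖², which admits the closed-form solution w = (XᵀX + λI)⁻¹Xᵀy. The data and regularization value below are illustrative assumptions:

```python
import numpy as np

def rls_fit(X, y, lam):
    """Closed-form regularized least squares: w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    # Solve the regularized normal equations rather than inverting explicitly.
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Illustrative synthetic data (an assumption, not from the lecture).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(100)

w = rls_fit(X, y, lam=0.1)
```

The regularizer λ‖w‖² makes the system XᵀX + λI strictly positive definite, so the solve is well posed even when XᵀX is singular.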

Syllabus

Introduction
Loss function
Optimality Condition
Quadratic Programming
Condition Number
Support Vectors
Perceptron
Gradient Descent
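The gradient-descent topic in the syllabus can be sketched on the same regularized least squares objective, (1/n)‖Xw − y‖² + λ‖w‖². This is an illustrative implementation under assumed data and step size, not the lecture's own code:

```python
import numpy as np

def rls_gd(X, y, lam, step=0.01, iters=2000):
    """Gradient descent on (1/n)||Xw - y||^2 + lam*||w||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        # Gradient of the objective at the current iterate.
        grad = (2.0 / n) * X.T @ (X @ w - y) + 2.0 * lam * w
        w -= step * grad
    return w

# Illustrative noiseless data (an assumption, not from the lecture).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))
y = X @ np.array([0.5, -1.0])

w_gd = rls_gd(X, y, lam=0.01)
# For comparison: the closed-form minimizer of the same objective.
w_closed = np.linalg.solve(X.T @ X + 200 * 0.01 * np.eye(2), X.T @ y)
```

Because the objective is strongly convex (the λ term bounds the smallest Hessian eigenvalue away from zero), gradient descent with a small enough step converges linearly to the closed-form solution.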


Taught by

MITCBMM

Related Courses

Statistical Machine Learning
Eberhard Karls University of Tübingen via YouTube
The Information Bottleneck Theory of Deep Neural Networks
Simons Institute via YouTube
Interpolation and Learning With Scale Dependent Kernels
MITCBMM via YouTube
Statistical Learning Theory and Applications - Class 16
MITCBMM via YouTube
Statistical Learning Theory and Applications - Class 6
MITCBMM via YouTube