Overview of Statistical Learning Theory - Part 2
Offered By: Simons Institute via YouTube
Course Description
Overview
Delve into the second part of a comprehensive tutorial on statistical learning theory presented by Nati Srebro from the Toyota Technological Institute at Chicago. Explore key concepts in 20th-century statistical learning theory, focusing on generalization through capacity control. Examine Vapnik and Chervonenkis's Fundamental Theorem of Learning, scale-sensitive capacity control and margins, and Minimum Description Length principles. Investigate parallels with stochastic optimization and explore generalization and capacity control from an optimization perspective, including online-to-batch conversion, stochastic approximation, boosting, and min-norm and max-margin concepts. Evaluate how classic theory aligns with current interests such as interpolation learning, benign overfitting, and implicit bias. Gain valuable insights into the foundations and modern applications of statistical learning theory in this hour-long lecture from the Modern Paradigms in Generalization Boot Camp at the Simons Institute.
Syllabus
Overview of Statistical Learning Theory - Part 2
Taught by
Simons Institute
Related Courses
Launching into Machine Learning 日本語版 (Japanese version) - Google Cloud via Coursera
Launching into Machine Learning auf Deutsch (in German) - Google Cloud via Coursera
Launching into Machine Learning en Français (in French) - Google Cloud via Coursera
Launching into Machine Learning en Español (in Spanish) - Google Cloud via Coursera
Основы машинного обучения (Fundamentals of Machine Learning) - Higher School of Economics via Coursera