
Prediction, Generalization, and Complexity: Classical Statistical Theory Revisited - Part 1

Offered By: Simons Institute via YouTube

Tags

Machine Learning Courses, Interpolation Courses, Optimism Courses

Course Description

Overview

Explore a comprehensive lecture on classical statistical decision theory and its application to modern machine learning paradigms. Delve into the relationship between prediction error, generalization gap, and model complexity from a fixed-X perspective in statistics. Examine the insights this approach offers and its limitations when applied to random-X settings common in machine learning. Discover how classical statistical concepts can be reinterpreted and extended to address the challenges of flexible models that interpolate training data. Gain valuable knowledge on bridging the gap between traditional statistical methods and contemporary machine learning approaches in this 1-hour 18-minute talk by Ryan Tibshirani from the University of California, Berkeley, presented at the Simons Institute's Modern Paradigms in Generalization Boot Camp.
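To make the fixed-X viewpoint discussed in the talk concrete, the small simulation below sketches the classical "optimism" identity: for ordinary least squares with n observations and p features, the expected in-sample prediction error at the fixed design points exceeds the expected training error by roughly 2*sigma^2*p/n. This is an illustrative sketch, not material from the lecture itself; all variable names and parameter values are assumptions chosen for the example.

import numpy as np

rng = np.random.default_rng(0)
n, p, sigma, reps = 200, 10, 1.0, 5000

X = rng.standard_normal((n, p))          # fixed design, reused in every replication
beta = rng.standard_normal(p)
f = X @ beta                             # true mean at the fixed design points
H = X @ np.linalg.solve(X.T @ X, X.T)    # hat matrix; trace(H) = p = degrees of freedom

train_err, pred_err = 0.0, 0.0
for _ in range(reps):
    y = f + sigma * rng.standard_normal(n)        # training response
    y_new = f + sigma * rng.standard_normal(n)    # fresh response at the same X
    y_hat = H @ y                                 # OLS fit
    train_err += np.mean((y - y_hat) ** 2)
    pred_err += np.mean((y_new - y_hat) ** 2)

optimism = (pred_err - train_err) / reps
print(f"simulated optimism       : {optimism:.4f}")
print(f"theoretical 2*sigma^2*p/n: {2 * sigma**2 * p / n:.4f}")

Running this, the simulated gap closely matches 2*sigma^2*p/n, the kind of complexity-driven generalization gap the fixed-X theory quantifies; the random-X settings contrasted in the lecture are where such clean identities start to break down.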

Syllabus

Prediction, Generalization, Complexity: Revisiting the Classical View from Statistics Part 1


Taught by

Simons Institute

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent