Convergence of Nearest Neighbor Classification - Sanjoy Dasgupta
Offered By: Institute for Advanced Study via YouTube
Course Description
Overview
Explore the convergence of nearest neighbor classification in this 49-minute Members' Seminar presented by Sanjoy Dasgupta of the University of California, San Diego. Delve into the nearest neighbor rule as a nonparametric estimator, the statistical learning theory setup, and consistency results under continuity. Examine universal consistency in R^d and in metric spaces, smoothness and margin conditions, and accurate rates of convergence under smoothness. Investigate tradeoffs in choosing k, an adaptive NN classifier, and a nonparametric notion of margin. Conclude with open problems in nearest neighbor classification.
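To make the object of study concrete, here is a minimal sketch of the k-nearest-neighbor classifier the talk analyzes. This is illustrative code, not material from the seminar; the Euclidean metric and majority vote are common default choices, assumed here.

```python
# Minimal k-nearest-neighbor classifier (illustrative sketch).
# Assumptions not from the seminar: Euclidean distance, majority vote.
from collections import Counter

def knn_predict(train, k, query):
    """Predict the label of `query` from `train`, a list of
    (point, label) pairs, by majority vote among the k points
    nearest to `query` under Euclidean distance."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    neighbors = sorted(train, key=lambda pair: dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy example: two labeled clusters in the plane.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, k=3, query=(0.2, 0.1)))  # → a
```

The convergence questions in the talk ask when this rule's error approaches the Bayes-optimal error as the training set grows, and how the choice of k trades off noise averaging against locality.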
Syllabus
Intro
Nearest neighbor
A nonparametric estimator
The data space
Statistical learning theory setup
Questions of interest
Consistency results under continuity
Universal consistency in R^d
A key geometric fact
Universal consistency in metric spaces
Smoothness and margin conditions
A better smoothness condition for NN
Accurate rates of convergence under smoothness
Under the hood
Tradeoffs in choosing k
An adaptive NN classifier
A nonparametric notion of margin
Open problems
Taught by
Institute for Advanced Study
Related Courses
Statistical Machine Learning - Eberhard Karls University of Tübingen via YouTube
The Information Bottleneck Theory of Deep Neural Networks - Simons Institute via YouTube
Interpolation and Learning With Scale Dependent Kernels - MITCBMM via YouTube
Statistical Learning Theory and Applications - Class 16 - MITCBMM via YouTube
Statistical Learning Theory and Applications - Class 6 - MITCBMM via YouTube