Exploiting Sparsity and Structure in Parametric and Nonparametric Estimation - 2007
Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube
Course Description
Overview
Explore new findings on sparse estimation in parametric graphical models and nonparametric regression in high dimensions in this 1-hour 5-minute lecture by John Lafferty from Carnegie Mellon University. Delve into l1 regularization techniques for estimating graph structures in high-dimensional settings and discover a novel nonparametric lasso method that regularizes estimator derivatives. Examine the challenges of semi-supervised learning and how unlabeled data can potentially enhance estimation. Analyze current regularization methods through the lens of minimax theory and learn about new approaches that yield improved convergence rates. Gain insights from Lafferty's extensive background in machine learning, statistical learning theory, computational statistics, and natural language processing as he presents joint work with collaborators in this Center for Language & Speech Processing talk at Johns Hopkins University.
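The graph-estimation portion of the talk centers on l1-penalized methods. As a purely illustrative sketch (not code from the lecture), the snippet below uses scikit-learn's GraphicalLasso to estimate a sparse precision matrix from Gaussian data, whose nonzero off-diagonal entries define an estimated conditional-independence graph; the data-generating setup and parameter choices are assumptions made only for demonstration.

```python
# Illustrative sketch of l1-regularized graph structure estimation
# (hypothetical example; not taken from the talk).
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Assumed ground truth: a chain-structured Gaussian graphical model
# over p variables, encoded by a sparse tridiagonal precision matrix.
p = 10
precision = np.eye(p)
for i in range(p - 1):
    precision[i, i + 1] = precision[i + 1, i] = 0.4
covariance = np.linalg.inv(precision)

# Sample n observations from the corresponding multivariate Gaussian.
n = 500
X = rng.multivariate_normal(np.zeros(p), covariance, size=n)

# Fit with an l1 penalty; alpha controls the sparsity of the estimate.
model = GraphicalLasso(alpha=0.1).fit(X)

# Recovered edges: nonzero off-diagonal entries of the estimated precision.
est = np.abs(model.precision_) > 1e-4
np.fill_diagonal(est, False)
edges = [(i, j) for i in range(p) for j in range(i + 1, p) if est[i, j]]
print("estimated edges:", edges)
```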
Syllabus
Exploiting Sparsity and Structure in Parametric and Nonparametric Estimation – John Lafferty - 2007
Taught by
Center for Language & Speech Processing (CLSP), JHU
Related Courses
機械学習・深層学習 (Machine Learning and Deep Learning) (ga120) - Waseda University via gacco
What are GANs actually - from underlying math to python code - Udemy
Artificial Intelligence Foundations: Machine Learning - LinkedIn Learning
HyperTransformer - Model Generation for Supervised and Semi-Supervised Few-Shot Learning - Yannic Kilcher via YouTube
Big Self-Supervised Models Are Strong Semi-Supervised Learners - Yannic Kilcher via YouTube