Sample Complexity of Estimation in Logistic Regression - Lecture
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore the intricacies of sample complexity in logistic regression parameter estimation through this lecture presented at IPAM's EnCORE Workshop. Delve into the non-asymptotic analysis of sample complexity, examining how it depends on the target error and the inverse temperature of the logistic regression model. Discover the three distinct temperature regimes—low, moderate, and high—and how each shapes the sample complexity curve. Gain insights into the challenge of estimating the parameter to within a given ℓ2 error, as a function of the dimension and the inverse temperature, under standard normal covariates. Compare this approach to traditional generalization bounds and to asymptotic performance analyses of maximum-likelihood estimators. Enhance your understanding of binary classification problems and of the logistic regression model as a noisy data-generation process.
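The setting described above can be illustrated with a minimal simulation: draw standard normal covariates, generate binary labels from the logistic model P(y = 1 | x) = σ(β⟨θ, x⟩) with inverse temperature β, and measure the ℓ2 error of a maximum-likelihood estimate. This is a hedged sketch of the general setup, not the lecture's analysis; the choice of dimension, β, sample size, and the assumption that β is known to the estimator are all illustrative assumptions.

```python
import numpy as np

def generate_data(n, d, theta, beta, rng):
    """Standard normal covariates; labels from the logistic model
    P(y = 1 | x) = sigmoid(beta * <theta, x>)."""
    X = rng.standard_normal((n, d))
    p = 1.0 / (1.0 + np.exp(-beta * (X @ theta)))
    y = (rng.random(n) < p).astype(float)
    return X, y

def fit_mle(X, y, beta, lr=0.1, steps=500):
    """Plain gradient ascent on the log-likelihood in theta,
    treating the inverse temperature beta as known (an assumption)."""
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-beta * (X @ theta)))
        theta += lr * beta * (X.T @ (y - p)) / n
    return theta

rng = np.random.default_rng(0)
d, beta = 5, 1.0
theta_star = np.ones(d) / np.sqrt(d)      # unit-norm true parameter
X, y = generate_data(4000, d, theta_star, beta, rng)
theta_hat = fit_mle(X, y, beta)
err = np.linalg.norm(theta_hat - theta_star)   # the l2 estimation error
```

Rerunning this with varying n traces out an empirical sample-complexity curve; varying β probes the temperature regimes the lecture distinguishes.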
Syllabus
Arya Mazumdar - Sample complexity of estimation in logistic regression - IPAM at UCLA
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Beyond Worst-Case Analysis - Panel Discussion (Simons Institute via YouTube)
Reinforcement Learning - Part I (Simons Institute via YouTube)
Reinforcement Learning in Feature Space: Complexity and Regret (Simons Institute via YouTube)
Exploration with Limited Memory - Streaming Algorithms for Coin Tossing, Noisy Comparisons, and Multi-Armed Bandits (Association for Computing Machinery (ACM) via YouTube)
Optimal Transport for Machine Learning - Gabriel Peyre, Ecole Normale Superieure (Alan Turing Institute via YouTube)