Hypothesis Selection with Computational Constraints
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the fundamental problem of hypothesis selection in learning theory and statistics through this 32-minute lecture by Maryam Aliakbarpour of Rice University. Delve into the challenge of selecting, from a finite set of candidate distributions, the one that best matches data drawn from an unknown distribution. Learn about the goal of designing algorithms that output a candidate whose total variation distance to the unknown distribution is as small as possible. Examine hypothesis selection under memory and time constraints, with a focus on achieving optimal tradeoffs between memory usage and sample complexity. Discover methods for obtaining optimal accuracy with algorithms whose time complexity is nearly optimal, in this Simons Institute presentation on extroverted sublinear algorithms.
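The lecture's memory- and time-constrained algorithms are not reproduced here, but a standard unconstrained baseline for this problem is the classical Scheffé tournament: run a pairwise test between every two candidates against the empirical data and output the candidate with the most wins. The sketch below is a minimal illustration of that baseline over a small discrete domain; the function name scheffe_tournament, the toy candidate distributions, and the sample size are illustrative choices, not taken from the lecture.

```python
import numpy as np

def scheffe_tournament(hypotheses, samples, domain_size):
    """Minimal Scheffe-tournament sketch for hypothesis selection.

    hypotheses: list of probability vectors over {0, ..., domain_size - 1}
                (the finite set of candidate distributions).
    samples:    1-D integer array of i.i.d. draws from the unknown distribution.
    Returns the index of the candidate that wins the most pairwise tests.
    """
    # Empirical frequency of each domain element among the samples.
    emp = np.bincount(samples, minlength=domain_size) / len(samples)

    k = len(hypotheses)
    wins = np.zeros(k, dtype=int)
    for i in range(k):
        for j in range(i + 1, k):
            hi, hj = hypotheses[i], hypotheses[j]
            # Scheffe set: points where h_i assigns more mass than h_j.
            A = hi > hj
            # Score each candidate by how well it predicts the empirical
            # mass of the Scheffe set; the closer one wins this pair.
            err_i = abs(hi[A].sum() - emp[A].sum())
            err_j = abs(hj[A].sum() - emp[A].sum())
            if err_i <= err_j:
                wins[i] += 1
            else:
                wins[j] += 1
    return int(np.argmax(wins))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 10  # small domain, for illustration only
    # Three toy candidates; the unknown distribution matches the first.
    candidates = [
        np.full(n, 1 / n),
        np.linspace(1, 2, n) / np.linspace(1, 2, n).sum(),
        np.linspace(2, 1, n) / np.linspace(2, 1, n).sum(),
    ]
    data = rng.choice(n, size=5000, p=np.full(n, 1 / n))
    print("selected hypothesis:", scheffe_tournament(candidates, data, n))
```

Note that this baseline stores all samples and runs in time quadratic in the number of candidates; reducing exactly these memory and time costs, while keeping the accuracy guarantee, is the subject of the lecture.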
Syllabus
Hypothesis selection with computational constraints
Taught by
Simons Institute
Related Courses
Beyond Worst-Case Analysis - Panel Discussion (Simons Institute via YouTube)
Reinforcement Learning - Part I (Simons Institute via YouTube)
Reinforcement Learning in Feature Space: Complexity and Regret (Simons Institute via YouTube)
Exploration with Limited Memory - Streaming Algorithms for Coin Tossing, Noisy Comparisons, and Multi-Armed Bandits (Association for Computing Machinery (ACM) via YouTube)
Optimal Transport for Machine Learning - Gabriel Peyre, Ecole Normale Superieure (Alan Turing Institute via YouTube)