Concentration Inequalities

Offered By: Indian Institute of Science Bangalore via Swayam

Tags

Statistics & Probability Courses, Concentration Inequalities Courses, Poincaré Inequality Courses

Course Description

Overview

It is well known that functions of large numbers of random quantities tend to behave predictably and 'less randomly' than their constituents. For instance, the laws of large numbers tell us that the average of many independent random variables is asymptotically equal to the expected value; higher-order refinements such as the central limit theorem and large deviations techniques uncover the asymptotic rate at which this reduction in randomness takes place. However, if one is interested in sharper estimates of the probability of deviation from the typical value (for a fixed number of observations, for functions other than the average, or for functions of dependent random variables), one must turn to more specific measure concentration bounds. Perhaps the most basic nontrivial examples in this regard are the Markov and Chebyshev inequalities, which are encountered in a first course on probability. This graduate-level course on concentration inequalities covers the basic material on this classic topic and introduces several advanced topics and techniques. The utility of the inequalities derived is illustrated through applications from electrical engineering, computer science and statistics.

A tentative list of topics:

1. Introduction and motivation: limit results and concentration bounds
2. Chernoff bounds: Hoeffding's inequality, Bennett's inequality, Bernstein's inequality
3. Variance bounds: Efron-Stein inequality, Poincaré inequality
4. The entropy method and log-Sobolev inequalities
5. The transportation method
6. Isoperimetric inequalities
7. Other special topics

Prerequisites: a course on probability, random processes or measure theory; basic mathematical maturity and working familiarity with probability calculations.
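The flavor of the Chernoff-type bounds mentioned above can be illustrated numerically. The following sketch (not part of the course materials; the function names and parameters are illustrative) compares the empirical probability that the mean of n bounded Bernoulli(1/2) samples deviates from its expectation by at least t with the two-sided Hoeffding bound 2·exp(-2nt²) for [0, 1]-valued variables:

```python
import math
import random

def empirical_deviation_prob(n, t, trials=20000, p=0.5, seed=0):
    """Monte Carlo estimate of P(|sample mean - p| >= t) for n Bernoulli(p) draws."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= t:
            hits += 1
    return hits / trials

def hoeffding_bound(n, t):
    """Two-sided Hoeffding bound for means of independent [0, 1]-valued variables."""
    return 2 * math.exp(-2 * n * t * t)

if __name__ == "__main__":
    n, t = 100, 0.1
    emp = empirical_deviation_prob(n, t)
    bound = hoeffding_bound(n, t)
    print(f"empirical deviation probability: {emp:.4f}")
    print(f"Hoeffding bound:                 {bound:.4f}")
```

For n = 100 and t = 0.1 the bound evaluates to 2·e⁻² ≈ 0.27, while the simulated frequency is considerably smaller, showing both that the bound holds and that it can be loose for moderate n.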

Syllabus

Week 1: Chernoff bounds
Week 2: Concentration bounds for sums and other functions of independent random variables
Week 3: Variance bounds for functions of independent random variables
Week 4: The entropy method for concentration inequalities
Week 5: Entropy method (contd.) and transportation method
Week 6: Transportation method, isoperimetry and concentration
Week 7: Log-Sobolev inequalities revisited
Week 8: Concentration inequalities for sequential data

Taught by

Prof. Himanshu Tyagi, Prof. Aditya Gopalan

Related Courses

Accounting for Death in War: Separating Fact from Fiction
Royal Holloway, University of London via FutureLearn
Advanced Machine Learning
The Open University via FutureLearn
Advanced Statistics for Data Science
Johns Hopkins University via Coursera
Agribusiness Management (農企業管理學)
National Taiwan University via Coursera
AI & Machine Learning
Arizona State University via Coursera