Information Theory

Offered By: NPTEL via Swayam

Tags

Mathematics Courses, Statistics & Probability Courses, Machine Learning Courses, Information Theory Courses, Probability Courses, Theoretical Computer Science Courses, Entropy Courses, Mutual Information Courses, KL Divergence Courses

Course Description

Overview

This is a graduate-level introductory course in information theory, where we introduce the mathematical notion of information and justify it through various operational meanings. The basic theory builds on probability theory and allows us to quantitatively measure the uncertainty and randomness in a random variable, as well as the information revealed by observing its value. We will encounter quantities such as entropy, mutual information, total variation distance, and KL divergence, and explain the role they play in important problems in communication, statistics, and computer science.

Information theory was originally invented as a mathematical theory of communication, but it has since found applications in many areas ranging from physics to biology. In fact, information theory can help in any field where one wants to evaluate how much information about an unknown quantity is revealed by a particular experiment. In this course, we will lay down the foundations of this fundamental field.
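The quantities named above can be computed directly for discrete distributions, and a small worked example may make them concrete. The following is a minimal sketch in Python; it is not part of the course materials, and the function names and coin-flip example are illustrative choices.

import math

def entropy(p):
    # Shannon entropy H(p) in bits: -sum of p(x) * log2(p(x)).
    return -sum(px * math.log2(px) for px in p if px > 0)

def kl_divergence(p, q):
    # KL divergence D(p || q) in bits (infinite if q(x) = 0 where p(x) > 0).
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

def total_variation(p, q):
    # Total variation distance: half the L1 distance between p and q.
    return 0.5 * sum(abs(px - qx) for px, qx in zip(p, q))

# A fair coin carries 1 bit of uncertainty; a biased coin carries less,
# and the deficit appears as KL divergence from the uniform distribution.
fair, biased = [0.5, 0.5], [0.9, 0.1]
print(entropy(fair))                  # 1.0
print(entropy(biased))                # ~0.469
print(kl_divergence(biased, fair))    # ~0.531, and 0.469 + 0.531 = 1.0
print(total_variation(fair, biased))  # 0.4

The last two comments illustrate a simple identity: for any distribution p on two outcomes, H(p) + D(p || uniform) = 1 bit, one small example of how entropy and divergence fit together.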
INTENDED AUDIENCE: Senior undergraduate and graduate students interested in probability, statistics, communication, theoretical computer science, machine learning, quantum information, and statistical physics.
PREREQUISITES: Undergraduate-level probability (sets and events, probability distributions, probability density functions, probability mass functions, random variables, expected value, variance, popular probability laws, Markov's inequality, Chebyshev's inequality, the central limit theorem, the law of large numbers).
INDUSTRIES SUPPORT: None

Syllabus

COURSE LAYOUT
Week 1: Introduction to entropy as a measure of uncertainty and randomness
Week 2: Binary hypothesis testing: Bayes optimal binary hypothesis testing and total variation distance, Neyman-Pearson formulation, Stein's lemma, and KL divergence
Week 3: Measures of information and their properties: chain rule and additivity, concavity, and variational formulae
Week 4: Data processing inequality, Pinsker's inequality, and Fano's inequality
Week 5: Data compression: fixed- and variable-length source coding theorems and entropy
Week 6: Huffman code, Shannon-Fano-Elias code, arithmetic code, hash tables (see the sketch after this list)
Week 7: Universal compression
Week 8: Channel coding: channel capacity theorem, sphere packing bound, maximal code construction
Week 9: Random coding and ML decoding
Week 10: LDPC and Polar codes
Week 11: Quantization
Week 12: Minimax lower bounds in statistics
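As a companion to the Week 6 topic, here is a minimal sketch of Huffman coding in Python. It is an illustrative implementation, not the course's own code: it greedily merges the two least likely symbols until one tree remains, yielding a prefix code whose expected length is within one bit of the source entropy.

import heapq

def huffman_code(probs):
    # probs: dict mapping symbol -> probability; returns symbol -> codeword.
    # Heap entries are (probability, tiebreaker, {symbol: partial codeword});
    # the integer tiebreaker keeps comparisons away from the dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# Example with dyadic probabilities: the expected codeword length,
# 0.5*1 + 0.25*2 + 0.125*3 + 0.125*3 = 1.75 bits, equals the entropy.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}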

Taught by

Prof. Himanshu Tyagi

Related Courses

4.0 Shades of Digitalisation for the Chemical and Process Industries
University of Padova via FutureLearn
A Day in the Life of a Data Engineer
Amazon Web Services via AWS Skill Builder
FinTech for Finance and Business Leaders
ACCA via edX
Accounting Data Analytics
University of Illinois at Urbana-Champaign via Coursera