
Stanford EE274: Data Compression - Beyond IID Distributions: Conditional Entropy - Lecture 8

Offered By: Stanford University via YouTube

Tags

Information Theory Courses
Signal Processing Courses
Algorithms Courses
Statistical Analysis Courses
Probability Theory Courses
Coding Theory Courses

Course Description

Overview

Explore the concept of conditional entropy and its role in compressing data beyond independent and identically distributed (IID) sources in this lecture from Stanford University's EE274: Data Compression I course. The material is presented by Professor Tsachy Weissman, together with Shubham Chandak and Pulkit Tandon, and shows how conditional entropy extends compression techniques to sources whose symbols are statistically dependent rather than independent. The course website provides supplementary materials for following along with the lecture, and enrollment information for the full online course is available through Stanford's online learning platform.
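
The lecture's central idea, that exploiting dependence between symbols lowers the achievable bits per symbol, can be illustrated with a small example. The sketch below is not from the course materials; it is a minimal Python illustration that estimates the marginal entropy H(X) and the first-order conditional entropy H(X_n | X_{n-1}) of a toy sequence with strong symbol-to-symbol dependence, where the conditional entropy is much smaller.

```python
import math
from collections import Counter

def entropy(counts):
    """Empirical entropy in bits/symbol from a Counter of symbol counts."""
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def conditional_entropy(seq):
    """Empirical H(X_n | X_{n-1}) under a first-order (previous-symbol) context model."""
    pair_counts = Counter(zip(seq, seq[1:]))      # counts of (previous, current) pairs
    prev_counts = Counter(seq[:-1])               # counts of the conditioning symbol
    total_pairs = len(seq) - 1
    h = 0.0
    for (prev, cur), c in pair_counts.items():
        p_pair = c / total_pairs                  # P(X_{n-1}=prev, X_n=cur)
        p_cond = c / prev_counts[prev]            # P(X_n=cur | X_{n-1}=prev)
        h -= p_pair * math.log2(p_cond)
    return h

# Toy non-IID source: each symbol is fully determined by the previous one.
seq = "ab" * 500
print(f"H(X)          ~ {entropy(Counter(seq)):.3f} bits/symbol")    # ~1.000
print(f"H(X | X_prev) ~ {conditional_entropy(seq):.3f} bits/symbol") # ~0.000
```

A compressor that models each symbol using its previous symbol as context (as context-based arithmetic coders do) can approach the lower conditional figure rather than the marginal one, which is the motivation for moving beyond IID models.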

Syllabus

Stanford EE274: Data Compression I (2023), Lecture 8 - Beyond IID Distributions: Conditional Entropy


Taught by

Stanford Online


Related Courses

Introduction to Statistics: Probability
University of California, Berkeley via edX
Aléatoire : une introduction aux probabilités - Partie 1 (Randomness: An Introduction to Probability, Part 1)
École Polytechnique via Coursera
Einführung in die Wahrscheinlichkeitstheorie (Introduction to Probability Theory)
Johannes Gutenberg University Mainz via iversity
Combinatorics and Probability
Moscow Institute of Physics and Technology via Coursera
Probability
University of Pennsylvania via Coursera