Stanford EE274: Data Compression - Beyond IID Distributions: Conditional Entropy - Lecture 8

Offered By: Stanford University via YouTube

Tags

Information Theory Courses Signal Processing Courses Algorithms Courses Statistical Analysis Courses Probability Theory Courses Coding Theory Courses

Course Description

Overview

Explore the concept of conditional entropy and its role in data compression beyond independent and identically distributed (IID) distributions in this lecture from Stanford University's EE274: Data Compression I course. The lecture is presented by Professor Tsachy Weissman, with contributions from Shubham Chandak and Pulkit Tandon, and shows how conditional entropy extends compression techniques to sources with dependencies between symbols. Supplementary materials are available on the course website, and information on enrolling in the full online course is available through Stanford's online learning platform.
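
The key idea the lecture builds on is that when symbols in a sequence are dependent, the conditional entropy H(X_i | X_{i-1}) can be much smaller than the marginal entropy H(X_i), so a compressor that models the dependence needs fewer bits per symbol. The Python sketch below is not part of the course materials; the function names and the toy sequence are illustrative assumptions. It estimates both quantities empirically for a repetitive string, where the gap is easy to see.

import math
from collections import Counter

def empirical_entropy(seq):
    # Zeroth-order (marginal) empirical entropy H(X) in bits per symbol.
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def empirical_conditional_entropy(seq):
    # First-order empirical conditional entropy H(X_i | X_{i-1}) in bits per symbol.
    pair_counts = Counter(zip(seq, seq[1:]))
    prev_counts = Counter(seq[:-1])
    n = len(seq) - 1
    h = 0.0
    for (prev, cur), c in pair_counts.items():
        p_pair = c / n                 # joint probability of (prev, cur)
        p_cond = c / prev_counts[prev] # probability of cur given prev
        h -= p_pair * math.log2(p_cond)
    return h

# A highly repetitive sequence: the IID model charges about 1 bit per symbol,
# while conditioning on the previous symbol reveals it is almost free to encode.
data = "ababababababababababab"
print(f"H(X)        = {empirical_entropy(data):.3f} bits/symbol")
print(f"H(X | prev) = {empirical_conditional_entropy(data):.3f} bits/symbol")

For this toy input the marginal entropy is 1 bit per symbol, while the conditional entropy is essentially 0, which is the motivation for compressors that go beyond the IID assumption.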

Syllabus

Stanford EE274: Data Compression I 2023 I Lecture 8 - Beyond IID distributions: Conditional entropy


Taught by

Stanford Online

Related Courses

Information Theory
The Chinese University of Hong Kong via Coursera
Intro to Computer Science
University of Virginia via Udacity
Analytic Combinatorics, Part I
Princeton University via Coursera
Algorithms, Part I
Princeton University via Coursera
Divide and Conquer, Sorting and Searching, and Randomized Algorithms
Stanford University via Coursera