Stanford EE274: Data Compression - Beyond IID Distributions: Conditional Entropy - Lecture 8
Offered By: Stanford University via YouTube
Course Description
Overview
Explore the concept of conditional entropy and its role in compressing data whose symbols are not independent and identically distributed (IID) in this lecture from Stanford University's EE274: Data Compression I course. Delve into the material presented by Professor Tsachy Weissman, along with insights from Shubham Chandak and Pulkit Tandon. Gain a deeper understanding of how conditional entropy extends compression techniques to sources with dependencies between symbols. Access the course website for supplementary materials and follow along with the discussion. For those interested in pursuing the full online course, enrollment information is available through Stanford's online learning platform.
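As a rough illustration of the core idea (a hypothetical sketch, not drawn from the lecture itself), the short Python example below computes the conditional entropy H(X|Y) = H(X, Y) - H(Y) for an assumed toy joint distribution; when H(X|Y) is lower than H(X), a compressor that conditions on context Y can in principle spend fewer bits per symbol than one that treats X as IID.

    import numpy as np

    # Minimal sketch (hypothetical example, not from the lecture): conditional entropy
    # H(X|Y) = H(X, Y) - H(Y), computed from an assumed joint distribution p(x, y).

    def entropy(p):
        """Shannon entropy in bits of a probability vector (zero entries are skipped)."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def conditional_entropy(joint):
        """H(X|Y), where `joint` is a 2D array with rows indexed by x and columns by y."""
        joint = np.asarray(joint, dtype=float)
        p_y = joint.sum(axis=0)                       # marginal distribution of Y
        return entropy(joint.ravel()) - entropy(p_y)  # H(X, Y) - H(Y)

    # Toy joint distribution in which X is strongly correlated with Y.
    joint = [[0.4, 0.1],
             [0.1, 0.4]]
    p_x = np.sum(joint, axis=1)
    print("H(X)   =", round(entropy(p_x), 3), "bits")                 # 1.0
    print("H(X|Y) =", round(conditional_entropy(joint), 3), "bits")   # ~0.722

Here H(X|Y) is well below H(X), which captures, in miniature, why modeling dependencies between symbols can beat an IID model of the same data.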
Syllabus
Stanford EE274: Data Compression I (2023), Lecture 8 - Beyond IID Distributions: Conditional Entropy
Taught by
Stanford Online
Related Courses
Information Theory - The Chinese University of Hong Kong via Coursera
Intro to Computer Science - University of Virginia via Udacity
Analytic Combinatorics, Part I - Princeton University via Coursera
Algorithms, Part I - Princeton University via Coursera
Divide and Conquer, Sorting and Searching, and Randomized Algorithms - Stanford University via Coursera