Data Compression I - Lecture 3: Kraft Inequality, Entropy, and Introduction to SCL
Offered By: Stanford University via YouTube
Course Description
Overview
Explore the fundamental concepts of data compression in this lecture from Stanford University's EE274 course. Delve into the Kraft inequality, a key theorem in information theory that characterizes exactly which sets of codeword lengths can be realized by a prefix-free code. Examine entropy, which quantifies the average amount of information per symbol emitted by a source and lower-bounds the achievable compression rate. Get an introduction to the Stanford Compression Library (SCL), a Python library used throughout the course for implementing and experimenting with compression algorithms. Follow along as Tsachy Weissman, Shubham Chandak, and Pulkit Tandon guide you through these essential topics in data compression theory and practice. Additional course materials and enrollment information are available through the provided links.
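As a quick illustration of the two main ideas in this lecture, the following sketch (not part of the course materials; function names are mine) checks the Kraft inequality for a set of codeword lengths and computes the Shannon entropy of a distribution:

```python
import math

def kraft_sum(lengths):
    """Sum of 2^(-l) over codeword lengths.
    A binary prefix-free code with these lengths exists iff the sum is <= 1."""
    return sum(2 ** -l for l in lengths)

def entropy(probs):
    """Shannon entropy in bits: H(p) = -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Lengths {1, 2, 3, 3} (e.g. codewords 0, 10, 110, 111) satisfy Kraft with equality:
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# For the matching dyadic distribution, entropy equals the average codeword length:
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
```

This also previews the coding theorem discussed in the course: entropy is the fundamental limit on the expected number of bits per symbol for lossless compression.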
Syllabus
Stanford EE274: Data Compression I 2023 I Lecture 3 - Kraft Inequality, Entropy, Introduction to SCL
Taught by
Stanford Online
Related Courses
Statistical Molecular Thermodynamics — University of Minnesota via Coursera
Thermodynamics — Indian Institute of Technology Bombay via edX
Introduction to Experimental Physics: Mechanics, Thermodynamics — Politecnico di Milano via Polimi OPEN KNOWLEDGE
Statistical Thermodynamics: Molecules to Machines — Carnegie Mellon University via Coursera
Engineering Thermodynamics — Indian Institute of Technology Kanpur via Swayam