YoVDO

Data Compression I - Lecture 3: Kraft Inequality, Entropy, and Introduction to SCL

Offered By: Stanford University via YouTube

Tags

Information Theory, Electrical Engineering, Entropy

Course Description

Overview

Explore the fundamental concepts of data compression in this lecture from Stanford University's EE274 course. Delve into the Kraft inequality, a central theorem of information theory that characterizes the codeword lengths achievable by prefix-free codes. Examine entropy, which quantifies the average amount of information per symbol emitted by a source. Get an introduction to the Stanford Compression Library (SCL), a Python library used throughout the course for implementing and experimenting with compression algorithms. Professor Tsachy Weissman, Shubham Chandak, and Pulkit Tandon guide you through these essential topics in data compression theory and practice.
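The two results named above are easy to check numerically. Below is a minimal, self-contained sketch (not course material) that verifies the Kraft inequality for a set of binary codeword lengths and computes the Shannon entropy of a source distribution; the function names are illustrative, not taken from the course or from SCL.

```python
import math

def kraft_sum(lengths):
    """Sum of 2^(-l) over the codeword lengths.

    A prefix-free binary code with these lengths exists
    if and only if this sum is at most 1 (Kraft inequality).
    """
    return sum(2 ** -l for l in lengths)

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Lengths (1, 2, 3, 3) satisfy Kraft with equality:
# 1/2 + 1/4 + 1/8 + 1/8 = 1.0
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# For the dyadic distribution (1/2, 1/4, 1/8, 1/8), the entropy
# equals the expected codeword length of those lengths: 1.75 bits.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

The dyadic example is chosen deliberately: when every probability is a power of two, an optimal prefix-free code achieves the entropy exactly, which is the link between the Kraft inequality and entropy developed in the lecture.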

Syllabus

Stanford EE274: Data Compression I 2023 I Lecture 3 - Kraft Inequality, Entropy, Introduction to SCL


Taught by

Stanford Online


Related Courses

Information Theory
The Chinese University of Hong Kong via Coursera
Fundamentals of Electrical Engineering
Rice University via Coursera
Digital Signal Processing
École Polytechnique Fédérale de Lausanne via Coursera
Circuits and Electronics 1: Basic Circuit Analysis
Massachusetts Institute of Technology via edX
Solar: Solar Cells, Fuel Cells and Batteries
Stanford University via Stanford OpenEdx