
Huffman Codes - An Information Theory Perspective

Offered By: Reducible via YouTube

Tags

Algorithms and Data Structures Courses
Information Theory Courses
Entropy Courses

Course Description

Overview

Dive into the fascinating world of data compression with this comprehensive video on Huffman Codes from an information theory perspective. Trace the evolution of compression algorithms, from the foundational concepts of information theory to the groundbreaking discovery of Huffman Codes. Learn how to model data compression problems and how to measure information through self-information and entropy. Understand the crucial connection between entropy and compression, and discover how Shannon-Fano coding paved the way for Huffman's improvement. Examine practical Huffman Coding examples and implementation techniques, and gain insight into the elegant simplicity of the Huffman algorithm and its significance in the field of data compression. Perfect for those interested in information theory, computer science, and the history of algorithmic breakthroughs.

Syllabus

Intro
Modeling Data Compression Problems
Measuring Information
Self-Information and Entropy
The Connection between Entropy and Compression
Shannon-Fano Coding
Huffman's Improvement
Huffman Coding Examples
Huffman Coding Implementation
Recap

Errata

At , the entropy was calculated with log base 10 instead of the expected log base 2. The correct values should be H(P1) = 1.49 bits and H(P2) = 0.47 bits.
At , all logarithms should be negated; I totally forgot about the negative sign.
At , I should have said the least likely symbols should have the *longest* encoding.
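
All three corrections are easy to check in a few lines of code. The sketch below is a minimal illustration, not the video's code: the distribution `dist` and the helper names (`self_information`, `entropy`, `huffman_code_lengths`) are hypothetical, and the probabilities are chosen only for demonstration, not to reproduce the video's corrected values. It computes entropy with negated base-2 logarithms and builds Huffman code lengths, confirming that the least likely symbols receive the longest codewords.

```python
import heapq
import math

def self_information(p):
    # I(x) = -log2(p(x)); the negative sign makes the value non-negative.
    return -math.log2(p)

def entropy(dist):
    # H(P) = sum of p(x) * -log2(p(x)) over all symbols, measured in bits.
    return sum(p * self_information(p) for p in dist.values() if p > 0)

def huffman_code_lengths(dist):
    # Standard Huffman construction: repeatedly merge the two least likely
    # nodes; every merge adds one bit to each symbol inside the merged node.
    heap = [(p, i, {sym: 0}) for i, (sym, p) in enumerate(dist.items())]
    heapq.heapify(heap)
    counter = len(heap)  # tie-breaker so tuples never compare dicts
    while len(heap) > 1:
        p1, _, lens1 = heapq.heappop(heap)
        p2, _, lens2 = heapq.heappop(heap)
        merged = {s: length + 1 for s, length in {**lens1, **lens2}.items()}
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

dist = {"A": 0.5, "B": 0.25, "C": 0.15, "D": 0.10}  # hypothetical distribution
print(f"H(P) = {entropy(dist):.2f} bits")
for sym, length in sorted(huffman_code_lengths(dist).items()):
    print(f"{sym}: p={dist[sym]}, codeword length={length}")
```

For this illustrative distribution the script prints H(P) = 1.74 bits and assigns a 1-bit codeword to A while C and D, the least likely symbols, get 3-bit codewords, which is exactly the property the third erratum describes.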


Taught by

Reducible

Related Courses

Information Theory
The Chinese University of Hong Kong via Coursera
Fundamentals of Electrical Engineering
Rice University via Coursera
Computational Neuroscience
University of Washington via Coursera
Introduction to Complexity
Santa Fe Institute via Complexity Explorer
Tutorials for Complex Systems
Santa Fe Institute via Complexity Explorer