Context-based Arithmetic Coding and LLM Compression - Stanford EE274 Lecture 9
Offered By: Stanford University via YouTube
Course Description
Overview
Explore advanced data compression techniques in this Stanford University lecture on context-based arithmetic coding and compression with large language models (LLMs). The lecture is presented by Professor Tsachy Weissman of Stanford's Department of Electrical Engineering, together with researchers Shubham Chandak and Pulkit Tandon. It covers the theoretical foundations and practical applications of context-based arithmetic coding and shows how these principles carry over to compression with large language models. Comprehensive course materials are available on the Stanford Data Compression Class website, and the full online course offers a deeper treatment of data compression theory and its real-world implementations.
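To make the central idea concrete, here is a minimal sketch, not taken from the lecture or the course materials: a context model predicts the probability of each symbol given its preceding symbols, and an arithmetic coder spends roughly -log2 of that probability in bits per symbol, so better predictions mean better compression. An LLM used for compression plays exactly the role of the count-based model below, just with far better probabilities. The order-2 byte model, the Laplace smoothing, and the function name are illustrative choices, not part of the course.

from collections import defaultdict
from math import log2

def adaptive_code_length(data: bytes, k: int = 2) -> float:
    """Ideal bits needed by an order-k adaptive context model with
    add-one (Laplace) smoothing over the 256 byte values; a real
    arithmetic coder would get within a few bits of this total."""
    counts = defaultdict(lambda: defaultdict(int))  # context -> symbol -> count
    totals = defaultdict(int)                       # context -> total count
    bits = 0.0
    for i, sym in enumerate(data):
        ctx = data[max(0, i - k):i]                 # previous k bytes as the context
        # Smoothed probability of this symbol given the context.
        p = (counts[ctx][sym] + 1) / (totals[ctx] + 256)
        bits += -log2(p)                            # ideal arithmetic-coding cost
        counts[ctx][sym] += 1                       # update the model; the decoder
        totals[ctx] += 1                            # can make the same update
    return bits

if __name__ == "__main__":
    text = b"abracadabra abracadabra abracadabra"
    print(f"order-2 model: {adaptive_code_length(text, 2):.1f} bits "
          f"for {8 * len(text)} raw bits")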
Syllabus
Stanford EE274: Data Compression I 2023 I Lecture 9 - Context-based AC & LLM Compression
Taught by
Stanford Online
Related Courses
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Security - Stanford University via Coursera
Intro to Computer Science - University of Virginia via Udacity
Introduction to Logic - Stanford University via Coursera
Internet History, Technology, and Security - University of Michigan via Coursera