MiniLLM: Knowledge Distillation of Large Language Models

Offered By: Unify via YouTube

Tags

Machine Learning Courses
Neural Networks Courses
Conversational AI Courses
Model Compression Courses

Course Description

Overview

Explore a 52-minute presentation on knowledge distillation of large language models by Yuxian Gu, a PhD student at Tsinghua University. Delve into a novel method that replaces the forward Kullback-Leibler divergence (KLD) with the reverse KLD in standard knowledge distillation for large language models. Discover how this change prevents the student model from overestimating low-probability regions of the teacher's distribution, yielding MiniLLMs that generate more precise, higher-quality responses than traditional knowledge distillation baselines. Learn about the research paper, its authors, and related resources. Gain insights into AI optimization, language models, and cutting-edge knowledge distillation techniques. Access additional materials covering AI research trends and deployment strategies, and connect with the Unify community through various platforms.
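
To make the forward-versus-reverse distinction concrete, below is a minimal PyTorch sketch, illustrative only and not the authors' implementation, contrasting the two divergences on per-token teacher and student distributions. The function names, tensor shapes, and vocabulary size are assumptions for illustration.

import torch
import torch.nn.functional as F

def forward_kld(teacher_logits, student_logits):
    # Standard KD objective: KL(teacher || student).
    # "Mode-covering": the student spreads mass over every region the
    # teacher assigns probability to, including its low-probability tail.
    t_logp = F.log_softmax(teacher_logits, dim=-1)
    s_logp = F.log_softmax(student_logits, dim=-1)
    # F.kl_div(input, target, log_target=True) computes KL(target || input),
    # so passing the student log-probs as `input` gives KL(teacher || student).
    return F.kl_div(s_logp, t_logp, log_target=True, reduction="batchmean")

def reverse_kld(teacher_logits, student_logits):
    # MiniLLM-style objective: KL(student || teacher).
    # "Mode-seeking": the student concentrates on the teacher's
    # high-probability regions instead of overestimating the tail.
    t_logp = F.log_softmax(teacher_logits, dim=-1)
    s_logp = F.log_softmax(student_logits, dim=-1)
    return F.kl_div(t_logp, s_logp, log_target=True, reduction="batchmean")

if __name__ == "__main__":
    torch.manual_seed(0)
    vocab_size = 32000                      # assumed vocabulary size
    teacher = torch.randn(8, vocab_size)    # 8 token positions x vocab logits
    student = torch.randn(8, vocab_size)
    print("forward KLD:", forward_kld(teacher, student).item())
    print("reverse KLD:", reverse_kld(teacher, student).item())

Note that the sketch only contrasts the two divergences at the token-distribution level; the MiniLLM paper optimizes the reverse KLD over sequences sampled from the student, using a policy-gradient-style training procedure to make that objective tractable.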

Syllabus

MiniLLM: Knowledge Distillation of Large Language Models


Taught by

Unify

Related Courses

Microsoft Azure Developer: Creating and Integrating AI with Azure Services
Pluralsight
Building Bots with Node.js
LinkedIn Learning
Artificial Intelligence on Microsoft Azure
Microsoft via Coursera
Preparing for AI-900: Microsoft Azure AI Fundamentals exam
Microsoft via Coursera
Build a Conversational AI Solution with Microsoft Azure
Pluralsight