Model Distillation - From Large Models to Efficient Enterprise Solutions
Offered By: Snorkel AI via YouTube
Course Description
Overview
Explore the concept of "Model Distillation" in this 57-minute video featuring Charlie Dickens, an applied research scientist, and Shane Johnson, senior director of product marketing at Snorkel AI. Learn how this essential technique builds more efficient enterprise AI systems, particularly in natural language processing. Discover various distillation methods, including knowledge extraction and transfer, and see real-world examples of their applications. Gain insights into Stanford's innovative approach using GPT-3 and NVIDIA's cutting-edge research. Understand strategies for selecting appropriate teacher and student models, ensuring data quality, and optimizing performance. Equip yourself with the knowledge to effectively harness model distillation in your enterprise, whether you're an AI professional or a business leader seeking advanced AI solutions.
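The core idea behind the knowledge distillation methods the talk covers can be sketched in a few lines: a small "student" model is trained to match the temperature-softened output distribution of a larger "teacher." The snippet below is an illustrative, framework-free sketch (the function names and example logits are invented for illustration, not taken from the talk); a real pipeline would compute this loss inside a deep-learning framework.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T yields softer probabilities."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    The T^2 factor keeps gradient magnitudes comparable across temperatures.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# The loss is zero when the student matches the teacher exactly,
# and positive otherwise.
teacher = [3.0, 1.0, 0.2]
print(distillation_loss(teacher, teacher))          # 0.0
print(distillation_loss(teacher, [0.5, 2.0, 1.0]))  # positive
```

In practice this distillation term is usually combined with a standard cross-entropy loss on ground-truth labels, weighted by a mixing coefficient.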
Syllabus
Introductions
Snorkel Flow overview
Why distillation
LLM distillation in action
Knowledge distillation
Snorkel AI's approach to knowledge distillation
Taxonomy development at scale
Case study: classifying provisions in legal contracts
Distillation takeaways: Challenges and best practices
Model distillation Q&A
Taught by
Snorkel AI
Related Courses
Solving the Last Mile Problem of Foundation Models with Data-Centric AI
MLOps.community via YouTube
Foundational Models in Enterprise AI - Challenges and Opportunities
MLOps.community via YouTube
Knowledge Distillation Demystified: Techniques and Applications
Snorkel AI via YouTube
Curate Training Data via Labeling Functions - 10 to 100x Faster
Snorkel AI via YouTube
Task Me Anything: Revolutionizing Multimodal Model Benchmarking
Snorkel AI via YouTube