YoVDO

Unlocking Mixture of Experts - From One Know-it-all to a Group of Jedi Masters

Offered By: EuroPython Conference via YouTube

Tags

Mixture-of-Experts Courses, Machine Learning Courses, Predictive Modeling Courses, Divide-and-Conquer Courses, Model Optimization Courses, Ensemble Models Courses, Pre-trained Models Courses, Fine-Tuning Courses

Course Description

Overview

Embark on an exhilarating journey exploring the Mixture of Experts (MoE) technique in this 31-minute conference talk from EuroPython 2024. Delve into a practical and intuitive next step for elevating the predictive power of generalized know-it-all models, particularly in critical domains like healthcare. Discover the Divide and Conquer principle behind MoE, along with its strengths, weaknesses, and limitations. Progress through a captivating exploration of insights, intuitive reasoning, and solid mathematical underpinnings, enriched with interesting examples. Survey the landscape from ensemble models to stacked estimators, gradually ascending to MoE. Explore challenges and alternative routes, and learn when to apply MoE effectively. Conclude with a business-oriented discussion of metrics around cost, latency, and throughput for MoE models. Gain access to resources for diving into pre-trained MoE models, fine-tuning them, or creating your own from scratch.
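To make the Divide and Conquer idea concrete, here is a minimal sketch of an MoE layer in PyTorch: a small gating network scores a pool of expert networks and routes each input to its top-scoring experts, whose outputs are combined by the gate's weights. The expert count, layer sizes, and top-2 routing below are illustrative assumptions, not details taken from the talk.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy Mixture of Experts: a gate routes each input to its top-k experts."""

    def __init__(self, d_model=64, d_hidden=128, n_experts=4, top_k=2):
        super().__init__()
        # Each expert is a small feed-forward network (sizes are illustrative).
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.ReLU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(n_experts)
        )
        # The gate produces one score per expert for every input.
        self.gate = nn.Linear(d_model, n_experts)
        self.top_k = top_k

    def forward(self, x):                      # x: (batch, d_model)
        scores = self.gate(x)                  # (batch, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e          # inputs routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# Usage: route a small random batch through the layer.
layer = MoELayer()
y = layer(torch.randn(8, 64))
print(y.shape)  # torch.Size([8, 64])

Because only the top-k experts run per input, compute cost grows with k rather than with the total number of experts, which is the lever behind the cost, latency, and throughput trade-offs discussed at the end of the talk.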

Syllabus

Unlocking Mixture of Experts: From One Know-it-all to a Group of Jedi Masters - Pranjal Biyani


Taught by

EuroPython Conference

Related Courses

Amazon SageMaker JumpStart Foundations (Japanese)
Amazon Web Services via AWS Skill Builder
AWS Flash - Generative AI with Diffusion Models
Amazon Web Services via AWS Skill Builder
AWS Flash - Operationalize Generative AI Applications (FMOps/LLMOps)
Amazon Web Services via AWS Skill Builder
AWS SimuLearn: Automate Fine-Tuning of an LLM
Amazon Web Services via AWS Skill Builder
AWS SimuLearn: Fine-Tune a Base Model with RLHF
Amazon Web Services via AWS Skill Builder