YoVDO

Model Merging and Mixtures of Experts - AI in Production

Offered By: MLOps.community via YouTube

Tags

Machine Learning Courses Neural Networks Courses MLOps Courses Fine-Tuning Courses Hugging Face Courses Mixture-of-Experts Courses

Course Description

Overview

Explore the cutting-edge techniques of model merging and Mixture of Experts (MoE) in this 11-minute conference talk from the AI in Production Conference. Dive into popular open-source methods for combining fine-tuned models to create state-of-the-art LLMs. Learn the main concepts of model merging and gain hands-on experience implementing it with the mergekit library. Discover how to create your own models and upload them directly to the Hugging Face Hub using the provided notebook. Presented by Maxime Labonne, a Machine Learning Scientist at J.P. Morgan with a Ph.D. from the Polytechnic Institute of Paris, the talk covers merging techniques such as SLERP, DARE/TIES, and frankenmerges, along with practical merging recipes. Gain valuable insights from an expert in the field and expand your knowledge of advanced LLM techniques.
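To give a flavor of one technique mentioned in the talk, the sketch below shows spherical linear interpolation (SLERP) applied to two weight tensors. This is a simplified, illustrative implementation using NumPy, not the mergekit library's actual code: mergekit applies SLERP (and other methods) layer by layer across full model checkpoints, driven by a YAML configuration.

```python
import numpy as np

def slerp(w_a: np.ndarray, w_b: np.ndarray, t: float = 0.5,
          eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    Interpolates along the arc between the flattened tensors, which
    preserves geometric properties that plain averaging can wash out.
    `t=0` returns w_a, `t=1` returns w_b.
    """
    a = w_a.flatten().astype(np.float64)
    b = w_b.flatten().astype(np.float64)

    # Angle between the two (normalized) weight vectors.
    a_unit = a / (np.linalg.norm(a) + eps)
    b_unit = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_unit, b_unit), -1.0, 1.0)
    theta = np.arccos(dot)

    if theta < eps:
        # Nearly colinear vectors: fall back to linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        s = np.sin(theta)
        merged = (np.sin((1.0 - t) * theta) / s) * a \
               + (np.sin(t * theta) / s) * b

    return merged.reshape(w_a.shape)
```

In practice you would loop this over matching parameter tensors of two fine-tuned checkpoints; mergekit handles that bookkeeping (and tokenizer/config merging) for you from a declarative config file.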

Syllabus

Intro
Welcome
Why Merging
Merging Techniques
SLERP
DARE/TIES
Passthrough technique
Frankenmerges
Mixture of experts
Merging recipes
Merging library
Fine Tuning
Conclusion
Outro


Taught by

MLOps.community

Related Courses

The AI Engineer Path
Scrimba
Developing Generative AI Applications with Python
IBM via edX
Models and Platforms for Generative AI
IBM via edX
Intro to Hugging Face
Codecademy
Large Language Models: Application through Production
Databricks via edX