
Mixtral Fine-Tuning and Inference - Advanced Guide

Offered By: Trelis Research via YouTube

Tags

Machine Learning Courses Quantization Courses Transformers Courses Inference Courses Fine-Tuning Courses Mixture-of-Experts Courses

Course Description

Overview

Explore Mixtral fine-tuning and inference in this 34-minute video tutorial from Trelis Research. Delve into the Mixture of Experts concept, comparing a basic transformer to this advanced approach. Review the Mixtral model card, set up inference and an API, and learn to run Mixtral with 4-bit quantization. Discover function calling capabilities and dive deep into fine-tuning techniques for Mixtral. Gain practical guidance on the fine-tuning and inference processes, with access to resources including GitHub repositories, Hugging Face models, and presentation slides.
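
The page itself contains no code, but as a rough illustration of the 4-bit quantized inference step mentioned in the description, here is a minimal sketch. It assumes the Hugging Face transformers and bitsandbytes libraries and the mistralai/Mixtral-8x7B-Instruct-v0.1 checkpoint; these library and model choices are assumptions, not details taken from the video.

```python
# Illustrative sketch only: loading Mixtral with 4-bit quantization via
# Hugging Face transformers + bitsandbytes (an assumed setup, not the
# course's confirmed code).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# NF4 4-bit quantization reduces the memory footprint of the 8x7B expert weights.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Mistral-style instruction prompt.
prompt = "[INST] Explain mixture-of-experts in one sentence. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```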

Syllabus

Mistral Mixture of Experts
Video Overview
Why Mixture of Experts?
Basic Transformer vs Mixture of Experts
Mixtral Model Card Review
Mixtral Inference and API Setup
Running Mixtral with 4-bit quantization
Function calling with Mixtral
Fine-tuning Mixtral
Notes on Fine-Tuning and Inference
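
The fine-tuning chapter's exact recipe is not spelled out on this page. Purely as an illustrative sketch of one common approach (QLoRA adapters via the peft library, applied to the 4-bit model loaded in the earlier sketch), with hyperparameters chosen only for illustration:

```python
# Illustrative sketch of one common fine-tuning approach for Mixtral
# (QLoRA via the peft library); the video's actual recipe may differ.
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# `model` is the 4-bit quantized Mixtral from the inference sketch above.
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=16,                 # LoRA rank (hypothetical value)
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    # Attention projections only; whether to also target the MoE expert
    # layers is a design choice this page does not specify.
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # sanity check: only adapter weights train
```

The adapted model can then be trained with any standard causal-language-modeling training loop or trainer on an instruction dataset of your choice.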


Taught by

Trelis Research

Related Courses

Discrete Inference and Learning in Artificial Vision
École Centrale Paris via Coursera
Teaching Literacy Through Film
The British Film Institute via FutureLearn
Linear Regression and Modeling
Duke University via Coursera
Probability and Statistics
Stanford University via Stanford OpenEdx
Statistical Reasoning
Stanford University via Stanford OpenEdx