Mixtral Fine-Tuning and Inference - Advanced Guide

Offered By: Trelis Research via YouTube

Tags

Machine Learning Courses, Quantization Courses, Transformers Courses, Inference Courses, Fine-Tuning Courses, Mixture-of-Experts Courses

Course Description

Overview

Explore the intricacies of Mixtral fine-tuning and inference in this 34-minute video tutorial from Trelis Research. Delve into the concept of Mixture of Experts, comparing the basic transformer architecture with this sparse, routed approach. Review the Mixtral model card, set up inference and an API, and learn to run Mixtral with 4-bit quantization. Discover its function calling capabilities and dive deep into fine-tuning techniques for Mixtral. Gain practical insights into fine-tuning and inference, with supporting resources including GitHub repositories, Hugging Face models, and presentation slides.
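To make the quantization step concrete, here is a minimal, illustrative sketch (not code from the video) of loading Mixtral in 4-bit with Hugging Face transformers and bitsandbytes. The model id, prompt, and generation settings are assumptions, and a GPU with substantial memory is required.

# Illustrative sketch only: load Mixtral with 4-bit quantization and run one prompt.
# Assumes `transformers`, `accelerate`, and `bitsandbytes` are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed model id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4-bit
    bnb_4bit_quant_type="nf4",             # NF4 quantization type
    bnb_4bit_compute_dtype=torch.float16,  # compute in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                     # spread layers across available GPUs
)

messages = [{"role": "user", "content": "Explain mixture of experts in one sentence."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
output = model.generate(inputs, max_new_tokens=100)
print(tokenizer.decode(output[0], skip_special_tokens=True))

NF4 quantization with fp16 compute keeps the 8-expert model small enough to fit on far less GPU memory than full precision, at some cost in output quality.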

Syllabus

Mistral Mixture of Experts
Video Overview
Why Mixture of Experts?
Basic Transformer vs Mixture of Experts (see the toy routing sketch after this syllabus)
Mixtral Model Card Review
Mixtral Inference and API Setup
Running Mixtral with 4-bit Quantization
Function Calling with Mixtral
Fine-tuning Mixtral
Notes on Fine-Tuning and Inference
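
As a conceptual companion to the "Why Mixture of Experts?" and "Basic Transformer vs Mixture of Experts" chapters, the toy PyTorch sketch below contrasts a dense transformer feed-forward block with a sparse, top-2-routed mixture-of-experts block, the routing scheme Mixtral uses with 8 experts. Layer sizes and the simple per-expert loop are illustrative, not the video's or Mixtral's actual implementation.

# Toy illustration (not the course's code): dense FFN vs. top-2 routed MoE FFN.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseFFN(nn.Module):
    """Standard transformer feed-forward block: every token uses all weights."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x):
        return self.down(F.gelu(self.up(x)))

class TopKMoEFFN(nn.Module):
    """Mixture-of-experts block: a router sends each token to k of n expert FFNs."""
    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(DenseFFN(d_model, d_ff) for _ in range(n_experts))
        self.router = nn.Linear(d_model, n_experts)
        self.k = k

    def forward(self, x):                      # x: (tokens, d_model)
        logits = self.router(x)                # (tokens, n_experts)
        weights, idx = logits.topk(self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):             # combine the k selected experts per token
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

tokens = torch.randn(4, 32)                    # 4 tokens, d_model = 32
print(TopKMoEFFN(32, 64)(tokens).shape)        # torch.Size([4, 32])

Because only the two selected experts run for each token, Mixtral has roughly 47B total parameters but only about 13B active per token, which is why its inference cost is closer to that of a mid-sized dense model.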


Taught by

Trelis Research

Related Courses

Aerial Image Segmentation with PyTorch
Coursera Project Network via Coursera
Discrete Inference and Learning in Artificial Vision
École Centrale Paris via Coursera
Building Language Models on AWS (Japanese subtitled version)
Amazon Web Services via AWS Skill Builder
ChatGPT Prompt Engineering for Developers
DeepLearning.AI via Independent
Introduction to Bayesian Statistics
Databricks via Coursera