Mixtral Fine-Tuning and Inference - Advanced Guide
Offered By: Trelis Research via YouTube
Course Description
Overview
Explore Mixtral fine-tuning and inference in this 34-minute video tutorial from Trelis Research. Learn why Mixture of Experts is used and how it differs from a basic transformer, review the Mixtral model card, set up inference and an API, and run Mixtral with 4-bit quantization. Cover function calling with Mixtral and fine-tuning techniques, then pick up practical notes on fine-tuning and inference, with supporting resources including GitHub repositories, Hugging Face models, and presentation slides.
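As a rough illustration of the 4-bit inference step the video covers, here is a minimal sketch of loading Mixtral with Hugging Face transformers and bitsandbytes. The model ID, prompt, and generation settings are assumptions for illustration, not the exact code used in the tutorial.

```python
# Minimal sketch (assumptions, not the video's script): 4-bit quantized
# inference for Mixtral with transformers + bitsandbytes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed model ID

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",              # NF4 4-bit quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16
    bnb_4bit_use_double_quant=True,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spread layers across available GPUs
)

# Mistral/Mixtral instruct prompt format
prompt = "[INST] Explain mixture of experts in one sentence. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```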
Syllabus
Mistral Mixture of Experts
Video Overview
Why Mixture of Experts?
Basic Transformer vs Mixture of Experts
Mixtral Model Card Review
Mixtral Inference and API Setup
Running Mixtral with 4-bit quantization
Function calling with Mixtral
Fine-tuning Mixtral (see the sketch after this syllabus)
Notes on Fine-Tuning and Inference
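As a companion to the fine-tuning chapter listed above, the sketch below shows a QLoRA-style setup with peft on top of a 4-bit Mixtral. The model ID, hyperparameters, and choice of target modules are assumptions for illustration, not the configuration used in the video.

```python
# Minimal sketch (assumptions, not the video's exact recipe): LoRA adapters
# on a 4-bit quantized Mixtral using transformers + peft.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed model ID

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)  # enable training on k-bit weights

# Target the attention projections; whether to also adapt the expert MLPs
# (w1/w2/w3) or the router gate is a separate design choice for MoE models.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# From here, training can proceed with transformers' Trainer or trl's SFTTrainer
# on a tokenized instruction dataset.
```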
Taught by
Trelis Research
Related Courses
Linear Circuits - Georgia Institute of Technology via Coursera
مقدمة في هندسة الطاقة والقوى (Introduction to Power and Energy Engineering) - King Abdulaziz University via Rwaq (رواق)
Magnetic Materials and Devices - Massachusetts Institute of Technology via edX
Linear Circuits 2: AC Analysis - Georgia Institute of Technology via Coursera
Transmisión de energía eléctrica (Electric Power Transmission) - Tecnológico de Monterrey via edX