Decoding Mistral AI's Large Language Models - Building Blocks and Training Strategies

Offered By: Databricks via YouTube

Tags

Mixture-of-Experts Courses
Mistral AI Courses

Course Description

Overview

Explore the building blocks and training strategies powering Mistral AI's large language models in this 36-minute session presented by Devendra Singh Chaplot, Research Scientist at Mistral AI. Delve into the open-source models Mixtral 8x7B and Mixtral 8x22B, which use a sparse mixture-of-experts (MoE) architecture and are released under the Apache 2.0 license. Gain insights into leveraging Mistral's "La Plateforme" API endpoints and get a sneak peek at upcoming features. Learn about the latest advancements in language-model technology and their practical applications in artificial intelligence.
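For readers new to the architecture named above, the sketch below illustrates the core MoE idea: a learned router scores all experts for each token, keeps the top 2 of 8, renormalizes their gate weights, and mixes the selected experts' outputs. This matches the routing pattern Mixtral 8x7B is documented to use (8 experts, 2 active per token), but it is a minimal illustration with made-up names and dimensions, not Mistral's implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Sparse mixture-of-experts layer (illustrative sketch only)."""

    def __init__(self, dim, hidden, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(dim, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x):
        # x: (tokens, dim); each token is routed independently.
        logits = self.router(x)                         # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)            # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                # tokens whose slot-th pick is expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

layer = MoELayer(dim=64, hidden=256)    # 8 experts, 2 active per token
print(layer(torch.randn(4, 64)).shape)  # torch.Size([4, 64])

The session also covers La Plateforme API endpoints. As a hedged sketch of a chat-completion call, the snippet below assumes the documented https://api.mistral.ai/v1/chat/completions endpoint, a MISTRAL_API_KEY environment variable, and the public open-mixtral-8x7b model identifier; verify all of these against Mistral's current documentation before use.

import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        # Model name assumed from Mistral's public catalog at the time of the talk.
        "model": "open-mixtral-8x7b",
        "messages": [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])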

Syllabus

Decoding Mistral AI's Large Language Models


Taught by

Databricks

Related Courses

Comprehensive Guide to Large Language Models in 2024 - Usage and Selection (MattVidPro AI via YouTube)
Mistral Large with Function Calling - Review and Implementation (Sam Witteveen via YouTube)
Intro to Mistral AI (Scrimba)
Intro to Mistral AI (Scrimba via Coursera)
Getting Started with Mistral (DeepLearning.AI via Coursera)