OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models

Offered By: Unify via YouTube

Tags

Mixture-of-Experts Courses, Artificial Intelligence Courses, Machine Learning Courses, Neural Networks Courses, Transformer Models Courses, Language Models Courses

Course Description

Overview

Explore a comprehensive presentation on OpenMoE, an early effort in open mixture-of-experts language models, delivered by Fuzhao Xue. Dive into the intricacies of this innovative approach to large language models, including the development of a series of open-source, decoder-only MoE LLMs ranging from 650M to 34B parameters. Learn about the cost-effectiveness of MoE models compared to dense LLMs, and gain insights into the routing mechanisms within these models. Discover key concepts such as Context-Independent Specialization and the challenges in routing decisions. Access additional resources, including the original research paper and related content from Unify, to deepen your understanding of this cutting-edge AI technology.
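To make the routing idea mentioned above concrete, the sketch below shows a minimal top-2 token router, the standard gating pattern used in mixture-of-experts layers. This is an illustrative assumption, not the actual OpenMoE implementation; all names, shapes, and the choice of top-2 gating are hypothetical.

# Minimal sketch of top-2 token routing in a Mixture-of-Experts layer.
# Illustrative only; shapes and names are assumptions, not OpenMoE's code.
import numpy as np

def top2_route(token_embeddings, router_weights):
    """Route each token to its two highest-scoring experts.

    token_embeddings: (num_tokens, d_model)
    router_weights:   (d_model, num_experts)
    Returns per-token expert indices and normalized gate values.
    """
    logits = token_embeddings @ router_weights            # (num_tokens, num_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)             # softmax over experts
    top2 = np.argsort(probs, axis=-1)[:, -2:][:, ::-1]     # best two experts per token
    gates = np.take_along_axis(probs, top2, axis=-1)
    gates /= gates.sum(axis=-1, keepdims=True)              # renormalize the two gates
    return top2, gates

# Example: 4 tokens, model dim 8, 4 experts
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
w_router = rng.normal(size=(8, 4))
experts, gates = top2_route(tokens, w_router)
print(experts, gates)

Each token's hidden state is then sent only to its selected experts and the expert outputs are combined using the gate values, which is what keeps per-token compute low relative to a dense model of the same total parameter count.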

Syllabus

OpenMoE Explained


Taught by

Unify

Related Courses

Microsoft Bot Framework and Conversation as a Platform
Microsoft via edX
Unlocking the Power of OpenAI for Startups - Microsoft for Startups
Microsoft via YouTube
Improving Customer Experiences with Speech to Text and Text to Speech
Microsoft via YouTube
Stanford Seminar - Deep Learning in Speech Recognition
Stanford University via YouTube
Select Topics in Python: Natural Language Processing
Codio via Coursera