Training Large Language Models - GPT-NeoX-20B, BigScience BLOOM, OPT-175B Explained

Offered By: Aleksa Gordić - The AI Epiphany via YouTube

Tags

Natural Language Processing (NLP) Courses Model Training Courses

Course Description

Overview

Explore three groundbreaking large language model projects in this comprehensive video lecture: BigScience's 176-billion-parameter BLOOM, Meta's 175-billion-parameter OPT, and EleutherAI's 20-billion-parameter GPT-NeoX-20B. Gain insights into the challenges of training these massive models, including accidental cluster deletions and dataset anomalies. Examine each project's paper, code, and publicly shared weights to deepen your understanding of large language models, and learn about the training processes, technical specifications, and unique features of each model through detailed explanations and chronicles from the researchers involved.

Syllabus

Intro
Sponsor: Weights & Biases
BLOOM paper
BLOOM chronicles
OPT paper
OPT chronicles
GPT-NeoX-20B paper
Outro


Taught by

Aleksa Gordić - The AI Epiphany

Related Courses

How Google does Machine Learning en Español
Google Cloud via Coursera
Creating Custom Callbacks in Keras
Coursera Project Network via Coursera
Automatic Machine Learning with H2O AutoML and Python
Coursera Project Network via Coursera
AI in Healthcare Capstone
Stanford University via Coursera
AutoML with Pycaret and TPOT
Coursera Project Network via Coursera