GPT-NeoX-20B - Open-Source Huge Language Model by EleutherAI - Interview With Co-Founder Connor Leahy

Offered By: Yannic Kilcher via YouTube

Tags

Model Training Courses

Course Description

Overview

Explore the development and capabilities of GPT-NeoX-20B, a 20-billion-parameter open-source language model, in this interview with EleutherAI co-founder Connor Leahy. Hear how the training hardware was acquired, how the model was trained, and how it performs on benchmarks. Learn what distinguishes GPT-Neo, GPT-J, and GPT-NeoX, and gain insight into the challenges of training models at this scale. Find out how to try the model yourself through GooseAI (a request sketch follows the syllabus below), and hear final thoughts on the project's impact and future potential.
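
Because the weights are openly released, the model can also be run locally without any API. Below is a minimal sketch, assuming the public Hugging Face checkpoint EleutherAI/gpt-neox-20b; note that at 16-bit precision the 20 billion parameters alone occupy roughly 40 GB, so this requires substantial hardware (or offloading across devices).

```python
# Minimal sketch: generate text from the released GPT-NeoX-20B weights.
# Assumes the public Hugging Face checkpoint "EleutherAI/gpt-neox-20b";
# at float16 the weights alone are ~40 GB, so plan hardware accordingly.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neox-20b",
    torch_dtype=torch.float16,  # halve memory relative to float32
    device_map="auto",          # spread layers across available devices
)

inputs = tokenizer("EleutherAI is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```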

Syllabus

- Intro
- Start of interview
- How did you get all the hardware?
- What's the scale of this model?
- A look into the experimental results
- Why are there GPT-Neo, GPT-J, and GPT-NeoX?
- How difficult is training these big models?
- Try out the model on GooseAI (a request sketch follows this list)
- Final thoughts
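
For the "Try out the model on GooseAI" chapter, here is a minimal sketch of what a request might look like. GooseAI exposes an OpenAI-compatible completions API; the endpoint path and the engine name gpt-neo-20b below are assumptions to verify against GooseAI's own documentation.

```python
# Minimal sketch of querying GPT-NeoX-20B through GooseAI's
# OpenAI-compatible completions API. The endpoint path and the
# engine name "gpt-neo-20b" are assumptions; check GooseAI's docs.
import os
import requests

API_KEY = os.environ["GOOSEAI_API_KEY"]  # hypothetical env var holding your key
ENGINE = "gpt-neo-20b"                   # assumed GooseAI engine ID for GPT-NeoX-20B

resp = requests.post(
    f"https://api.goose.ai/v1/engines/{ENGINE}/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"prompt": "EleutherAI is", "max_tokens": 40},
    timeout=60,
)
resp.raise_for_status()
# Assuming the OpenAI-style response schema, the completion text lives here:
print(resp.json()["choices"][0]["text"])
```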


Taught by

Yannic Kilcher

Related Courses

How Google does Machine Learning en Español
Google Cloud via Coursera
Creating Custom Callbacks in Keras
Coursera Project Network via Coursera
Automatic Machine Learning with H2O AutoML and Python
Coursera Project Network via Coursera
AI in Healthcare Capstone
Stanford University via Coursera
AutoML con Pycaret y TPOT
Coursera Project Network via Coursera