GPT-2: Language Models are Unsupervised Multitask Learners

Offered By: Yannic Kilcher via YouTube

Tags

GPT-2 Courses, Artificial Intelligence Courses, Machine Learning Courses, ChatGPT Courses

Course Description

Overview

Explore OpenAI's groundbreaking GPT-2 language model and the controversy surrounding its release in this 28-minute video analysis. Delve into the model's ability to perform natural language processing tasks without explicit supervision, including question answering, machine translation, reading comprehension, and summarization. Examine how GPT-2, trained on the massive WebText dataset, achieves state-of-the-art results on multiple language modeling benchmarks in a zero-shot setting, that is, without any task-specific fine-tuning. Discover the potential implications of this technology for building more advanced language processing systems that learn from naturally occurring demonstrations, while considering the ethical concerns and debates sparked by its development.

Syllabus

GPT-2: Language Models are Unsupervised Multitask Learners


Taught by

Yannic Kilcher

Related Courses

ChatGPT and AI: A Guide for Managers and HR (ChatGPT et IA : mode d'emploi pour managers et RH)
CNAM via France Université Numérique
Generating New Recipes using GPT-2
Coursera Project Network via Coursera
Deep Learning NLP: Training GPT-2 from scratch
Coursera Project Network via Coursera
Data Science A-Z: Hands-On Exercises & ChatGPT Prize [2024]
Udemy
Deep Learning A-Z 2024: Neural Networks, AI & ChatGPT Prize
Udemy