GPT-2: Language Models are Unsupervised Multitask Learners

Offered By: Yannic Kilcher via YouTube

Tags

GPT-2, Artificial Intelligence, Machine Learning, ChatGPT

Course Description

Overview

Explore OpenAI's GPT-2 language model and the controversy surrounding its release in this 28-minute video analysis. Delve into the model's ability to perform a range of natural language processing tasks without explicit supervision, including question answering, machine translation, reading comprehension, and summarization. Examine how GPT-2, trained on the large-scale WebText dataset, achieves state-of-the-art results on multiple language modeling benchmarks in a zero-shot setting. Consider the potential of this approach for building language processing systems that learn tasks from naturally occurring demonstrations, along with the ethical concerns and debates sparked by the model's staged release.

Syllabus

GPT-2: Language Models are Unsupervised Multitask Learners


Taught by

Yannic Kilcher

Related Courses

Generating New Recipes using GPT-2
Coursera Project Network via Coursera
Deep Learning NLP: Training GPT-2 from scratch
Coursera Project Network via Coursera
Artificial Creativity
Parsons School of Design via Coursera
Coding Train Late Night - GPT-2, Hue Lights, Discord Bot
Coding Train via YouTube
Coding Train Late Night - Fetch, GPT-2 and RunwayML
Coding Train via YouTube