Deep Learning NLP: Training GPT-2 from scratch

Offered By: Coursera Project Network via Coursera

Tags

GPT-2 Courses, Natural Language Processing (NLP) Courses, Chatbot Courses, ChatGPT Courses

Course Description

Overview

In this 1-hour project-based course, we will explore Transformer-based Natural Language Processing. Specifically, we will look at re-training, or fine-tuning, GPT-2, an NLP machine learning model based on the Transformer architecture. We will cover the history of GPT-2 and its development, review the basics of the Transformer architecture, learn what kind of training data to use and how to collect it, and finally perform the fine-tuning process itself. In the final task, we will discuss use cases and what the future holds for Transformer-based NLP.
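The course does not publish its training code, but to give a feel for the fine-tuning step, here is a minimal sketch using the Hugging Face transformers library. The library choice, the hyperparameters, and the training file train.txt are all illustrative assumptions, not the course's actual setup.

```python
# A minimal GPT-2 fine-tuning sketch with Hugging Face transformers.
# Assumptions: "train.txt" is a hypothetical plain-text corpus, and
# the hyperparameters below are arbitrary starting points.
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Split the raw text file into fixed-length blocks of token IDs.
dataset = TextDataset(tokenizer=tokenizer, file_path="train.txt", block_size=128)

# Causal language modeling: mlm=False makes the collator use the
# inputs themselves (shifted by one) as labels, GPT-2 style.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-finetuned",
    overwrite_output_dir=True,
    num_train_epochs=3,
    per_device_train_batch_size=4,
)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
trainer.save_model("gpt2-finetuned")
```

After training, the saved weights can be loaded for sampling, for example with pipeline("text-generation", model="gpt2-finetuned") from the same library.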

I would encourage learners to do further research and experimentation with the GPT-2 model, as well as other NLP models!

Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.

Taught by

Charles Ivan Niswander II

Related Courses

Natural Language Processing
Columbia University via Coursera
Natural Language Processing
Stanford University via Coursera
Introduction to Natural Language Processing
University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies)
Universidad de Alicante via Miríadax
Natural Language Processing
Indian Institute of Technology, Kharagpur via Swayam