Deep Learning NLP: Training GPT-2 from scratch
Offered By: Coursera Project Network via Coursera
Course Description
Overview
In this 1-hour long project-based course, we will explore Transformer-based Natural Language Processing. Specifically, we will take a look at re-training, or fine-tuning, GPT-2, an NLP machine learning model based on the Transformer architecture. We will cover the history of GPT-2 and its development, cover the basics of the Transformer architecture, learn what type of training data to use and how to collect it, and finally perform the fine-tuning process. In the final task, we will discuss use cases and what the future holds for Transformer-based NLP.
I would encourage learners to do further research and experimentation with the GPT-2 model, as well as other NLP models!
Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
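As a rough illustration of the fine-tuning step described above, here is a minimal sketch using the Hugging Face Transformers library; it is not the course's exact notebook, and the file path "train.txt", the hyperparameters, and the output directory are illustrative assumptions.

# Minimal sketch: fine-tuning GPT-2 on a plain-text corpus with Hugging Face Transformers.
# The corpus path and hyperparameters below are assumptions, not the course's settings.
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Plain-text training data; block_size is the token context length per example.
train_dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="train.txt",  # hypothetical path to your collected corpus
    block_size=128,
)
# Causal language modeling, so mlm=False (no masked-token objective).
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="gpt2-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=train_dataset,
)
trainer.train()
trainer.save_model("gpt2-finetuned")

After training, the saved checkpoint can be loaded back and sampled from with model.generate (or the Transformers text-generation pipeline) to inspect how the fine-tuned model's outputs differ from the base GPT-2.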
Taught by
Charles Ivan Niswander II
Related Courses
ChatGPT et IA : mode d'emploi pour managers et RH
CNAM via France Université Numerique
Generating New Recipes using GPT-2
Coursera Project Network via Coursera
Data Science A-Z: Hands-On Exercises & ChatGPT Prize [2024]
Udemy
Deep Learning A-Z 2024: Neural Networks, AI & ChatGPT Prize
Udemy
Machine Learning A-Z: AI, Python & R + ChatGPT Prize [2024]
Udemy