YoVDO

Pre-training and Pre-trained Models in Advanced NLP - Lecture 5

Offered By: Graham Neubig via YouTube

Tags

BERT Courses, Language Models Courses, T5 Courses, Transformer Architecture Courses, RoBERTa Courses

Course Description

Overview

Explore the fundamentals of pre-training and pre-trained models in this comprehensive lecture from CMU's Advanced NLP course. Delve into the key aspects of pre-training, including objectives and data sources. Examine the differences between open and closed models, and gain insights into representative pre-trained models in the field. Learn from expert instructor Xiang Yue as part of the CMU CS 11-711 curriculum, offering a deep dive into advanced natural language processing concepts.
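One of the pre-training objectives covered in lectures like this is BERT-style masked language modeling, where a fraction of input tokens is corrupted and the model is trained to recover the originals. As a rough illustration (not material from the lecture itself), here is a minimal sketch of BERT's published masking recipe: roughly 15% of tokens are selected, and of those, 80% are replaced with a [MASK] token, 10% with a random vocabulary token, and 10% left unchanged. The vocabulary and sentence below are toy examples invented for the sketch.

```python
import random

MASK = "[MASK]"
TOY_VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]  # toy vocabulary, not a real tokenizer

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style corruption: select ~mask_prob of positions; of those,
    80% become [MASK], 10% a random token, 10% stay unchanged.
    Returns (corrupted, labels); labels hold the original token at
    selected positions and None elsewhere (unselected positions
    contribute nothing to the training loss)."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # model must predict this original token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)
            elif r < 0.9:
                corrupted.append(rng.choice(TOY_VOCAB))
            else:
                corrupted.append(tok)  # kept unchanged, still predicted
        else:
            labels.append(None)
            corrupted.append(tok)
    return corrupted, labels

sentence = "the cat sat on the mat".split()
corrupted, labels = mask_tokens(sentence, mask_prob=0.3)
print(corrupted)
print(labels)
```

Encoder-only models such as BERT and RoBERTa train on this kind of denoising objective, while T5 uses a related span-corruption variant and decoder-only models typically use plain next-token prediction; the lecture contrasts these objective families in detail.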

Syllabus

CMU Advanced NLP Fall 2024 (5): Pre-training and Pre-trained Models


Taught by

Graham Neubig

Related Courses

Sentiment Analysis with Deep Learning using BERT
Coursera Project Network via Coursera
Natural Language Processing with Attention Models
DeepLearning.AI via Coursera
Fine Tune BERT for Text Classification with TensorFlow
Coursera Project Network via Coursera
Deploy a BERT question answering bot on Django
Coursera Project Network via Coursera
Generating discrete sequences: language and music
Ural Federal University via edX