YoVDO

Pre-training and Pre-trained Models in Advanced NLP - Lecture 5

Offered By: Graham Neubig via YouTube

Tags

BERT Courses Language Models Courses T5 Courses Transformer Architecture Courses RoBERTa Courses

Course Description

Overview

Explore the fundamentals of pre-training and pre-trained models in this lecture from CMU's Advanced NLP course (CS 11-711). Delve into the key aspects of pre-training, including training objectives and data sources. Examine the differences between open and closed models, and gain insights into representative pre-trained models in the field. Instructor Xiang Yue delivers this session of the CMU CS 11-711 curriculum, offering a deep dive into advanced natural language processing concepts.

Syllabus

CMU Advanced NLP Fall 2024 (5): Pre-training and Pre-trained Models


Taught by

Graham Neubig

Related Courses

Artificial Intelligence Foundations: Neural Networks
LinkedIn Learning
Transformers: Text Classification for NLP Using BERT
LinkedIn Learning
TensorFlow: Working with NLP
LinkedIn Learning
Learn Natural Language Processing with BERT! - NLP Techniques Leading from Attention and Transformer to BERT -
Udemy
Complete Natural Language Processing Tutorial in Python
Keith Galli via YouTube