Pre-training and Pre-trained Models in Advanced NLP - Lecture 5

Offered By: Graham Neubig via YouTube

Tags

BERT Courses, Language Models Courses, T5 Courses, Transformer Architecture Courses, RoBERTa Courses

Course Description

Overview

Explore the fundamentals of pre-training and pre-trained models in this comprehensive lecture from CMU's Advanced NLP course. Delve into key aspects of pre-training, including training objectives and data sources. Examine the differences between open and closed models, and gain insight into representative pre-trained models in the field. Learn from instructor Xiang Yue as part of the CMU CS 11-711 curriculum in this deep dive into advanced natural language processing concepts.
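
One of the lecture's core topics, pre-training objectives, can be sketched in a few lines of code. Below is a minimal illustration (not taken from the course materials) of the masked language modeling objective used by BERT-style models, written with the Hugging Face transformers library; the checkpoint and masked position are arbitrary choices for demonstration.

```python
# A minimal sketch of the masked language modeling (MLM) pre-training
# objective, assuming the Hugging Face transformers library.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

text = "Pre-trained models learn from large unlabeled corpora."
inputs = tokenizer(text, return_tensors="pt")

# Hide one token and ask the model to reconstruct it; the cross-entropy
# loss at the masked position is the MLM pre-training objective.
labels = inputs["input_ids"].clone()
mask_pos = 3  # hypothetical position to mask, chosen for illustration
labels[0, :mask_pos] = -100        # -100 = ignore in the loss
labels[0, mask_pos + 1:] = -100
inputs["input_ids"][0, mask_pos] = tokenizer.mask_token_id

outputs = model(**inputs, labels=labels)
print(f"MLM loss on the masked token: {outputs.loss.item():.3f}")
```

In full pre-training, roughly 15% of tokens are masked at random across a large corpus rather than a single hand-picked position as above.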

Syllabus

CMU Advanced NLP Fall 2024 (5): Pre-training and Pre-trained Models


Taught by

Graham Neubig

Related Courses

Multi-Label Classification on Unhealthy Comments - Finetuning RoBERTa with PyTorch - Coding Tutorial
rupert ai via YouTube
Hugging Face Transformers - The Basics - Practical Coding Guides - NLP Models (BERT/RoBERTa)
rupert ai via YouTube
Programming Language of the Future: AI in Your Native Language
Linux Foundation via YouTube
Fine-tuning LLMs Without Maxing Out Your GPU - LoRA for Parameter-Efficient Training
Data Centric via YouTube
MLOps: OpenVino Quantized Pipeline for Grammatical Error Correction
The Machine Learning Engineer via YouTube