CMU Advanced NLP: How to Use Pre-Trained Models

Offered By: Graham Neubig via YouTube

Tags

Natural Language Processing (NLP), Pre-trained Models, Fine-Tuning, In-Context Learning

Course Description

Overview

Explore how to use pre-trained models in this guest lecture by Aditi Raghunathan for CMU's Advanced NLP course. Delve into the reasons behind pretraining, examine satellite remote sensing applications, and compare different fine-tuning methods. Investigate empirical observations, linear probing techniques, and various regularizers. Engage with in-context learning concepts, develop mental models, and understand latent concepts and problem distributions. A discussion and question session rounds out the lecture and deepens your understanding of advanced natural language processing techniques.
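The contrast the lecture draws between linear probing and full fine-tuning can be illustrated with a minimal numpy sketch: keep a "pretrained" feature extractor frozen and train only a linear head on top of it. The random-projection backbone and the toy data below are invented stand-ins, not anything from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "pretrained" feature extractor: a fixed random projection.
# In linear probing this backbone is frozen and never updated.
W_backbone = rng.normal(size=(2, 4))

def features(x):
    return np.tanh(x @ W_backbone)  # frozen features

# Linearly separable toy data in 2-D.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Linear probe: gradient descent on a logistic-loss linear head only.
w = np.zeros(4)
b = 0.0
lr = 0.5
for _ in range(300):
    z = features(X) @ w + b
    p = 1.0 / (1.0 + np.exp(-z))   # sigmoid
    grad = p - y                   # logistic-loss gradient
    w -= lr * features(X).T @ grad / len(X)
    b -= lr * grad.mean()

acc = ((features(X) @ w + b > 0) == (y > 0.5)).mean()
print(f"linear-probe accuracy: {acc:.2f}")
```

Full fine-tuning would additionally backpropagate into `W_backbone`; the lecture's empirical observations concern when each choice generalizes better.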

Syllabus

Intro
Why Pretraining?
Satellite Remote Sensing
Fine-Tuning Pretrained Models
Comparing Two Methods of Fine-Tuning
Empirical Observations
Linear Probing
Fine-Tuning
Regularizers
Linear Probe
Questions
Discussion
In-Context Learning
What's Happening Here?
Mental Model
Latent Concept
Problem Distribution
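The in-context learning items in the syllabus refer to conditioning a model on a few demonstrations in its prompt rather than updating any weights. A minimal sketch of the prompt-construction step follows; the example reviews and labels are invented, and no model API call is made here:

```python
# In-context learning: the "training" signal is demonstrations placed
# directly in the prompt; the model's weights are untouched.
demos = [
    ("The movie was wonderful.", "positive"),
    ("I regret buying this.", "negative"),
]
query = "A delightful surprise from start to finish."

prompt = "".join(f"Review: {x}\nSentiment: {y}\n\n" for x, y in demos)
prompt += f"Review: {query}\nSentiment:"
print(prompt)
```

The lecture's "latent concept" view treats such prompts as evidence that lets the model infer which task, drawn from its pretraining problem distribution, it is being asked to perform.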


Taught by

Graham Neubig

Related Courses

Stanford Seminar 2022 - Transformer Circuits, Induction Heads, In-Context Learning
Stanford University via YouTube
Pretraining Task Diversity and the Emergence of Non-Bayesian In-Context Learning for Regression
Simons Institute via YouTube
In-Context Learning: A Case Study of Simple Function Classes
Simons Institute via YouTube
AI Mastery: Ultimate Crash Course in Prompt Engineering for Large Language Models
Data Science Dojo via YouTube
New Summarization Techniques for LLM Applications - Building a Note-Taking App
Sam Witteveen via YouTube