YoVDO

Instruction Tuning of Large Language Models - Lecture

Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube

Tags

ChatGPT Courses, GPT-4 Courses, Crowdsourcing Courses, Multi-Task Learning Courses, Generalization Courses, Instruction-Tuning Courses

Course Description

Overview

Explore the concept of instruction tuning for large language models in this 48-minute lecture by Yizhong Wang from the University of Washington. Delve into the evolution of NLP models, from task-specific approaches to generalist models like ChatGPT and GPT-4. Examine the impact of expert-written instructions and cross-task generalization on model performance. Investigate the factors contributing to LLM improvements, including data quality and quantity. Learn about innovative techniques for generating instruction datasets using GPT-3, and evaluate their effectiveness through performance metrics and expert assessments. Consider the implications of data size and quality on model capabilities, and reflect on potential licensing concerns related to using OpenAI-generated content.
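To make the lecture's central idea concrete, below is a minimal Python sketch of how an instruction-tuning training example might be rendered into a prompt-completion pair. The field names ("instruction", "input", "output") and the prompt template are illustrative assumptions, not the exact format used in the talk.

```python
# A minimal sketch of an instruction-tuning data record and how it could be
# formatted for training. Field names and template are assumptions for
# illustration only.

def format_example(example: dict) -> dict:
    """Turn an instruction/input/output record into a prompt-completion pair."""
    prompt = f"Instruction: {example['instruction']}\n"
    if example.get("input"):
        prompt += f"Input: {example['input']}\n"
    prompt += "Output:"
    return {"prompt": prompt, "completion": " " + example["output"]}

example = {
    "instruction": "Classify the sentiment of the sentence as positive or negative.",
    "input": "The lecture was clear and well organized.",
    "output": "positive",
}
print(format_example(example))
```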

Syllabus

Intro
ChatGPT/GPT-4 are real generalists
How did models acquire these vast capabilities?
NLP before 2018: building task-specific models
Classical multi-task learning
Generalization to unseen tasks via instructions
Expert-written instructions for all tasks
Strict train/test split for cross-task generalization
Instruction tuning significantly improves LLMs
What are the most important factors?
Other models trained on existing NLP datasets
Data is OpenAI's secret weapon
Can we construct a similar instruction dataset by crowdsourcing?
LLMs can be prompted to generate instructions
LLMs can be prompted to generate instances
Instruction data generation pipeline (see the sketch after this syllabus)
Generating 52K instructions with GPT-3
Tasks generated by GPT-3
Data quality review
Performance on SuperNI
Expert evaluation on 252 user-oriented instructions
Effect of data size and data quality (using human eval)
Takeaways
Licensing concerns about using OpenAI output?
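For readers who want a concrete picture of the "Instruction data generation pipeline" item above, here is a hedged Python sketch of a Self-Instruct-style loop: prompt a model for new instructions, filter near-duplicates, then prompt again for instances. The `generate` callable, the prompt wording, and the similarity filter are simplified assumptions, not the exact method presented in the lecture.

```python
# A simplified sketch of an instruction-bootstrapping loop. `generate` is a
# user-supplied callable standing in for any LLM completion API (hypothetical).

import random
from difflib import SequenceMatcher

def too_similar(candidate: str, pool: list[str], threshold: float = 0.7) -> bool:
    """Reject candidates that nearly duplicate an existing instruction."""
    return any(SequenceMatcher(None, candidate, t).ratio() > threshold for t in pool)

def expand_pool(seed_tasks: list[str], generate, rounds: int = 10) -> list[dict]:
    """Grow a pool of instructions from seed tasks and collect instances for them."""
    pool = list(seed_tasks)
    dataset = []
    for _ in range(rounds):
        # 1. Show the model a few instructions from the pool and ask for a new one.
        demos = "\n".join(random.sample(pool, k=min(3, len(pool))))
        new_instruction = generate(f"Here are some tasks:\n{demos}\nWrite a new task:").strip()
        if too_similar(new_instruction, pool):
            continue  # skip near-duplicates to keep the pool diverse
        # 2. Ask the model for an input/output instance of the new instruction.
        instance = generate(f"Task: {new_instruction}\nGive one input and its output:")
        pool.append(new_instruction)
        dataset.append({"instruction": new_instruction, "instance": instance})
    return dataset
```

In the lecture's setting, `generate` would wrap calls to GPT-3, and the resulting dataset is then reviewed for quality before being used for instruction tuning.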


Taught by

Center for Language & Speech Processing (CLSP), JHU

Related Courses

AI Foundations: Prompt Engineering with ChatGPT
Arizona State University via Coursera
AI para docentes: Transforma tu enseñanza con ChatGPT
Universidad Anáhuac via Coursera
Intro to AI for Digital Marketing
Davidson College via edX
AI Prompt Engineering for Beginners
Davidson College via edX
Herramientas de Inteligencia Artificial para la productividad. Más allá del ChatGPT
Universitat Politècnica de València via edX