
P-Tuning: A Parameter Efficient Tuning to Boost LLM Performance

Offered By: GAIA via YouTube

Tags

Swedish Courses
Low-Resource Languages Courses

Course Description

Overview

Explore parameter-efficient tuning techniques for boosting Large Language Model (LLM) performance in this 25-minute conference talk from the 2023 GAIA Conference. Delve into the adaptation of p-tuning, a prompt-learning method, to low-resource language settings, with a focus on Swedish. Learn about an improved version of p-tuning implemented in NVIDIA NeMo that enables continuous multitask learning of virtual prompts. Gain insights from Zenodia Charpy, a senior deep learning data scientist at NVIDIA, as she shares her expertise in training and deploying very large language models for non-English and low-resource languages. Discover how these techniques can help solve real-world natural language tasks, improve performance on a range of downstream NLP tasks, and keep LLM responses factually grounded.
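
For readers unfamiliar with the technique, the sketch below illustrates the core idea behind p-tuning: a small trainable prompt encoder produces continuous "virtual prompt" embeddings that are prepended to the input embeddings while the backbone LLM stays frozen, so only a tiny fraction of parameters is updated. This is a minimal, generic PyTorch/Hugging Face sketch, not the NVIDIA NeMo implementation discussed in the talk; the "gpt2" backbone, the 20-token prompt length, and the MLP prompt encoder are illustrative assumptions.

```python
# Minimal p-tuning sketch (NOT the NeMo implementation from the talk).
# Assumptions: "gpt2" backbone, 20 virtual tokens, MLP prompt encoder.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

NUM_VIRTUAL_TOKENS = 20  # assumed virtual-prompt length

model = GPT2LMHeadModel.from_pretrained("gpt2")
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

# Freeze every weight of the base LLM: only the prompt encoder is trained.
for p in model.parameters():
    p.requires_grad = False

hidden = model.config.n_embd

class PromptEncoder(nn.Module):
    """Maps trainable virtual-token ids to continuous prompt embeddings."""
    def __init__(self, num_tokens, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(num_tokens, hidden_size)
        # Small MLP re-parameterization of the prompt embeddings.
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, hidden_size),
        )

    def forward(self, batch_size):
        ids = torch.arange(self.embedding.num_embeddings)
        prompts = self.mlp(self.embedding(ids))  # (num_tokens, hidden)
        return prompts.unsqueeze(0).expand(batch_size, -1, -1)

prompt_encoder = PromptEncoder(NUM_VIRTUAL_TOKENS, hidden)
optimizer = torch.optim.AdamW(prompt_encoder.parameters(), lr=1e-4)

# One illustrative training step on a toy Swedish example.
batch = tokenizer(["Stockholm är huvudstaden i Sverige."], return_tensors="pt")
input_embeds = model.get_input_embeddings()(batch["input_ids"])
virtual_prompts = prompt_encoder(input_embeds.size(0))

# Prepend the virtual prompts to the real token embeddings.
inputs = torch.cat([virtual_prompts, input_embeds], dim=1)
# Ignore the loss on virtual-prompt positions (-100 = ignore index).
labels = torch.cat(
    [torch.full((1, NUM_VIRTUAL_TOKENS), -100), batch["input_ids"]], dim=1
)

loss = model(inputs_embeds=inputs, labels=labels).loss
loss.backward()          # gradients flow only into the prompt encoder
optimizer.step()
print(f"loss: {loss.item():.3f}")
```

The NeMo version described in the talk builds on this idea but, per the course description, additionally supports continuous multitask learning of virtual prompts, so prompts for new tasks can be added without retraining the frozen backbone.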

Syllabus

P-Tuning: A Parameter Efficient Tuning to Boost LLM Performance by Zenodia Charpy


Taught by

GAIA

Related Courses

Swedish Made Easy, Day 6 - Comfortable in 6 days (Udemy)
Swedish Made Easy, Day 5 - Comfortable in 6 days (Udemy)
Swedish Made Easy, Day 1 - Comfortable in 6 days (Udemy)
Swedish Made Easy, Day 3 - Comfortable in 6 days (Udemy)
Swedish Made Easy, Day 2 - Comfortable in 6 days (Udemy)