YoVDO

P-Tuning: A Parameter Efficient Tuning to Boost LLM Performance

Offered By: GAIA via YouTube

Tags

Swedish, Low-Resource Languages

Course Description

Overview

Explore parameter-efficient tuning techniques for boosting Large Language Model (LLM) performance in this 25-minute conference talk from the 2023 GAIA Conference. Delve into the adaptation of p-tuning, a prompt-learning method, for low-resource language settings, with a focus on Swedish. Learn about an improved version of p-tuning implemented in NVIDIA NeMo that enables continuous multitask learning of virtual prompts. Gain insights from Zenodia Charpy, a senior deep learning data scientist at NVIDIA, as she shares her expertise in training and deploying very large language models for non-English and low-resource languages. Discover how these techniques can help solve real-world natural language tasks, improve performance on various downstream NLP tasks, and strengthen the factual grounding of LLM responses.
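The core idea behind p-tuning, as described above, is that the pretrained LLM stays frozen while a small prompt encoder learns continuous "virtual prompt" embeddings that are prepended to the input. The following is a minimal PyTorch sketch of that idea; the names (`ToyLM`, `PromptEncoder`) and the tiny stand-in model are illustrative assumptions, not the NeMo API or the speaker's actual implementation.

```python
import torch
import torch.nn as nn

class ToyLM(nn.Module):
    """Stand-in for a frozen pretrained language model."""
    def __init__(self, vocab=100, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.backbone = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, input_embeds):
        # Operates on embeddings so virtual prompts can be prepended.
        return self.head(self.backbone(input_embeds))

class PromptEncoder(nn.Module):
    """Learns continuous virtual-prompt embeddings (the only trained params)."""
    def __init__(self, n_virtual=8, dim=16):
        super().__init__()
        self.virtual = nn.Parameter(torch.randn(n_virtual, dim) * 0.02)
        # P-tuning re-parameterizes the virtual tokens through a small network.
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, batch_size):
        return self.mlp(self.virtual).unsqueeze(0).expand(batch_size, -1, -1)

lm = ToyLM()
for p in lm.parameters():
    p.requires_grad_(False)                    # the LLM backbone stays frozen

prompt = PromptEncoder()
opt = torch.optim.Adam(prompt.parameters(), lr=1e-3)

tokens = torch.randint(0, 100, (2, 5))         # toy batch: 2 sequences of 5 tokens
embeds = lm.embed(tokens)                      # frozen token embeddings
inputs = torch.cat([prompt(tokens.size(0)), embeds], dim=1)  # prepend virtual prompts
logits = lm(inputs)                            # shape: (2, 8 + 5, vocab)

loss = logits.mean()                           # placeholder objective
loss.backward()
opt.step()

trainable = sum(p.numel() for p in prompt.parameters() if p.requires_grad)
frozen = sum(p.numel() for p in lm.parameters())
print(trainable, frozen)
```

Because only the prompt encoder's parameters receive gradients, the per-task storage cost is a few thousand parameters rather than the full model, which is what makes the approach parameter-efficient and suited to maintaining many downstream tasks at once.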

Syllabus

P-Tuning: A Parameter Efficient Tuning to Boost LLM Performance by Zenodia Charpy


Taught by

GAIA

Related Courses

Building Transformer Tokenizers - Dhivehi NLP #1
James Briggs via YouTube
Low Resource Machine Translation
Alfredo Canziani via YouTube
CMU Multilingual NLP - The LORELEI Project
Graham Neubig via YouTube
CMU Multilingual NLP - Information Extraction
Graham Neubig via YouTube
CMU Multilingual NLP 2020 - Text to Speech
Graham Neubig via YouTube