YoVDO

Pre-training and Fine-tuning of Code Generation Models

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Machine Learning Courses
Programming Languages Courses
Transformers Courses
Model Deployment Courses
Responsible AI Courses
PEFT Courses

Course Description

Overview

Explore the behind-the-scenes process of building and training large code models such as StarCoder in this keynote presentation. Examine the abilities of large language models trained on code, including code completion and code synthesis from natural language descriptions. Learn about the development of StarCoder, a 15B-parameter code generation model trained on 80+ programming languages with responsible AI practices built into its data pipeline. Discover how to use these models with open-source libraries such as transformers and PEFT, and gain insights into efficient deployment strategies. The keynote covers both the pre-training and fine-tuning techniques used in code generation models, presented by Loubna Ben Allal, a Machine Learning Engineer at Hugging Face.

Syllabus

Keynote: Pre-training and Fine-tuning of Code Generation Models - Loubna Ben-Allal, Hugging Face


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Linear Circuits
Georgia Institute of Technology via Coursera
Introduction to Energy and Power Engineering (مقدمة في هندسة الطاقة والقوى)
King Abdulaziz University via Rwaq (رواق)
Magnetic Materials and Devices
Massachusetts Institute of Technology via edX
Linear Circuits 2: AC Analysis
Georgia Institute of Technology via Coursera
Electric Power Transmission (Transmisión de energía eléctrica)
Tecnológico de Monterrey via edX