YoVDO

LLMOPS: Training a Small LLM (GPT-2) - Machine Learning and Data Science

Offered By: The Machine Learning Engineer via YouTube

Tags

GPT-2 Courses Machine Learning Courses Deep Learning Courses Neural Networks Courses Model Training Courses Transformer Architecture Courses Fine-Tuning Courses LLMOps Courses

Course Description

Overview

Explore the process of training a small large language model (LLM) - specifically a base GPT-2 with roughly 117 million parameters - on a local computer. Delve into the challenges associated with training LLMs from scratch and understand why this task is primarily undertaken by large organizations. Learn about the intricacies of LLM training, including data preparation, model architecture, and computational requirements. Gain insights into the LLMOps (Large Language Model Operations) workflow and its importance in the field of machine learning and data science. Access accompanying notebooks on GitHub to follow along with the practical implementation. This 53-minute video provides a comprehensive look at the complexities and considerations involved in training and deploying smaller-scale language models.
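As a quick sanity check on the "117 million parameters" figure, the size of the base GPT-2 can be derived from its published hyperparameters (vocabulary 50,257, context length 1,024, width 768, 12 layers). The breakdown below is a sketch assuming the standard decoder-only transformer layout used by nanoGPT-style implementations, with tied input/output embeddings; counting every learnable weight gives roughly 124M, which is why the original paper's 117M figure and nanoGPT's reported ~124M both refer to the same model.

```python
# Back-of-the-envelope parameter count for the GPT-2 "small" configuration.
# Hyperparameters are from the public GPT-2 release; the per-module breakdown
# is an assumption based on the standard GPT-2 layout (tied lm_head, learned
# positional embeddings, pre-LayerNorm blocks).

def gpt2_small_param_count(vocab_size=50257, n_ctx=1024, d_model=768,
                           n_layer=12, d_ff=3072):
    """Count learnable parameters in a GPT-2-style decoder."""
    wte = vocab_size * d_model   # token embeddings (shared with output head)
    wpe = n_ctx * d_model        # learned positional embeddings
    ln = 2 * d_model             # LayerNorm: scale + bias
    # attention: fused qkv projection + output projection (weights + biases)
    attn = (d_model * 3 * d_model + 3 * d_model) + (d_model * d_model + d_model)
    # feed-forward: up-projection + down-projection (weights + biases)
    mlp = (d_model * d_ff + d_ff) + (d_ff * d_model + d_model)
    block = 2 * ln + attn + mlp  # two LayerNorms per transformer block
    return wte + wpe + n_layer * block + ln  # plus the final LayerNorm

total = gpt2_small_param_count()
print(f"{total:,} parameters (~{total / 1e6:.0f}M)")  # 124,439,808 (~124M)
```

The gap between 117M and 124M is mostly the ~39M embedding matrix: depending on whether embeddings are counted (nanoGPT, for instance, excludes positional embeddings in its printed count), the same architecture is quoted at slightly different sizes.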

Syllabus

LLMOPS : Train a LLM nanoGPT (GPT2) #machinelearning #datascience


Taught by

The Machine Learning Engineer

Related Courses

Large Language Models: Application through Production
Databricks via edX
LLMOps - LLM Bootcamp
The Full Stack via YouTube
MLOps: Why DevOps Solutions Fall Short in the Machine Learning World
Linux Foundation via YouTube
Quick Wins Across the Enterprise with Responsible AI
Microsoft via YouTube
End-to-End AI App Development: Prompt Engineering to LLMOps
Microsoft via YouTube