Generative AI: Working with Large Language Models
Offered By: LinkedIn Learning
Course Description
Overview
Explore a user-friendly approach to working with transformers and large language models for natural language processing.
Syllabus
Introduction
- Learning about Large Language Models
- What are large language models?
- Transformers in production
- Transformers: History
- Transfer learning
- Transformer: Architecture overview
- Self-attention
- Multi-head attention and Feed Forward Network
- GPT-3
- GPT-3 use cases
- Challenges and shortcomings of GPT-3
- GLaM
- Megatron-Turing NLG Model
- Gopher
- Scaling laws
- Chinchilla
- BIG-bench
- PaLM
- OPT and BLOOM
- Going further with Transformers
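Several syllabus topics above (self-attention, multi-head attention) center on the transformer's core computation. As a rough illustration of what that lesson covers, here is a minimal sketch of scaled dot-product self-attention in NumPy; the function name, shapes, and random toy inputs are illustrative assumptions, not material from the course.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings X.

    X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_head) projection matrices.
    (Illustrative sketch only; shapes and names are assumptions.)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project inputs to queries, keys, values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)             # pairwise similarity, scaled by sqrt(d_k)
    # Numerically stable softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                          # each output is a weighted sum of values

# Toy example: 3 tokens, embedding dim 4, head dim 2
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
Wq, Wk, Wv = (rng.normal(size=(4, 2)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # one 2-dim output vector per input token
```

Multi-head attention, also listed above, simply runs several such projections in parallel and concatenates the results.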
Taught by
Jonathan Fernandes
Related Courses
- How to Build Codex Solutions (Microsoft via YouTube)
- Unlocking the Power of OpenAI for Startups - Microsoft for Startups (Microsoft via YouTube)
- Building Intelligent Applications with World-Class AI (Microsoft via YouTube)
- Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3 (Stanford University via YouTube)
- ChatGPT: GPT-3, GPT-4 Turbo: Unleash the Power of LLM's (Udemy)