Generative AI: Working with Large Language Models
Offered By: LinkedIn Learning
Course Description
Overview
Explore a user-friendly approach to working with transformers and large language models for natural language processing.
Syllabus
Introduction
- Learning about Large Language Models
- What are large language models?
- Transformers in production
- Transformers: History
- Transfer learning
- Transformer: Architecture overview
- Self-attention
- Multi-head attention and Feed Forward Network
- GPT-3
- GPT-3 use cases
- Challenges and shortcomings of GPT-3
- GLaM
- Megatron-Turing NLG Model
- Gopher
- Scaling laws
- Chinchilla
- BIG-bench
- PaLM
- OPT and BLOOM
- Going further with Transformers
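Since the syllabus covers self-attention and multi-head attention, here is a minimal NumPy sketch of scaled dot-product self-attention as background. This is an illustrative sketch, not material from the course; the function names, shapes, and random weights are all assumptions chosen for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) learned projections
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted sum of values, (seq_len, d_k)

# Toy example: 4 tokens, model dimension 8
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Multi-head attention, also listed in the syllabus, runs several such attention computations in parallel with separate projections and concatenates the results.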
Taught by
Jonathan Fernandes
Related Courses
- Structuring Machine Learning Projects (DeepLearning.AI via Coursera)
- Natural Language Processing on Google Cloud (Google Cloud via Coursera)
- Introduction to Learning Transfer and Life Long Learning (3L) (University of California, Irvine via Coursera)
- Advanced Deployment Scenarios with TensorFlow (DeepLearning.AI via Coursera)
- Neural Style Transfer with TensorFlow (Coursera Project Network via Coursera)