
Generative AI: Working with Large Language Models

Offered By: LinkedIn Learning

Tags

GPT-3 Courses, Transfer Learning Courses, Transformers Courses, Self-Attention Courses

Course Description

Overview

Explore a user-friendly approach to working with transformers and large language models for natural language processing.

Syllabus

Introduction
  • Learning about Large Language Models
1. Transformers in NLP
  • What are large language models?
  • Transformers in production
  • Transformers: History
2. Training Transformers and Their Architecture
  • Transfer learning
  • Transformer: Architecture overview
  • Self-attention
  • Multi-head attention and Feed Forward Network
3. Large Language Models
  • GPT-3
  • GPT-3 use cases
  • Challenges and shortcomings of GPT-3
  • GLaM
  • Megatron-Turing NLG Model
  • Gopher
  • Scaling laws
  • Chinchilla
  • BIG-bench
  • PaLM
  • OPT and BLOOM
Conclusion
  • Going further with Transformers
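
The "Self-attention" and "Multi-head attention and Feed Forward Network" chapters cover the core computation inside a transformer block. As a rough companion to those videos, here is a minimal NumPy sketch of single-head scaled dot-product self-attention; the shapes, weights, and function names are illustrative assumptions, not material taken from the course.

  import numpy as np

  def softmax(x, axis=-1):
      e = np.exp(x - x.max(axis=axis, keepdims=True))
      return e / e.sum(axis=axis, keepdims=True)

  def self_attention(X, Wq, Wk, Wv):
      # Project token embeddings to queries, keys, and values
      Q, K, V = X @ Wq, X @ Wk, X @ Wv
      d_k = K.shape[-1]
      # Compare every token with every other token, scaled by sqrt(d_k)
      scores = Q @ K.T / np.sqrt(d_k)
      # Attention weights sum to 1 across each row
      weights = softmax(scores, axis=-1)
      # Each output is a weighted mix of the value vectors
      return weights @ V

  # Toy example: a sequence of 4 tokens with embedding size 8 (hypothetical values)
  rng = np.random.default_rng(0)
  X = rng.normal(size=(4, 8))
  Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
  print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)

Multi-head attention runs several such heads in parallel on separate learned projections and concatenates their outputs before a final linear layer.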
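
The "Scaling laws" and "Chinchilla" chapters look at how model size and training data should grow together. A widely cited rule of thumb from the Chinchilla work is roughly 20 training tokens per model parameter for compute-optimal training; the sketch below simply applies that ratio. The helper name and the fixed constant are simplifications assumed here for illustration, not an exact law from the course.

  # Rough Chinchilla-style rule of thumb: ~20 training tokens per parameter.
  # The constant is an approximation used for illustration only.
  def approx_optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
      return n_params * tokens_per_param

  for n_params in (1e9, 70e9, 175e9):
      tokens = approx_optimal_tokens(n_params)
      print(f"{n_params / 1e9:,.0f}B parameters -> ~{tokens / 1e9:,.0f}B training tokens")

Chinchilla itself pairs about 70B parameters with roughly 1.4T training tokens, which is how the paper reports it matching or beating much larger models, such as Gopher, that were trained on less data.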

Taught by

Jonathan Fernandes

Related Courses

• Structuring Machine Learning Projects (DeepLearning.AI via Coursera)
• Natural Language Processing on Google Cloud (Google Cloud via Coursera)
• Introduction to Learning Transfer and Life Long Learning (3L) (University of California, Irvine via Coursera)
• Advanced Deployment Scenarios with TensorFlow (DeepLearning.AI via Coursera)
• Neural Style Transfer with TensorFlow (Coursera Project Network via Coursera)