Generative AI: Introduction to Large Language Models
Offered By: LinkedIn Learning
Course Description
Overview
Gain foundational knowledge of how large language models and other generative AI models work.
Syllabus
Introduction
- Generative AI with large language models
- What is generative AI?
- What is a large language model?
- Types of large language models
- The evolution of large language models
- What is a neural network?
- How do neural networks learn?
- Deep learning and its significance
- The Transformer architecture
- What is an encoder-decoder?
- The attention mechanism
- How does self-attention work?
- Common applications of large language models
- Challenges with large language models
- The future of large language models
- Next steps
Taught by
Frederick Nwanganga
Related Courses
- Transformers: Text Classification for NLP Using BERT (LinkedIn Learning)
- TensorFlow: Working with NLP (LinkedIn Learning)
- TransGAN - Two Transformers Can Make One Strong GAN - Machine Learning Research Paper Explained (Yannic Kilcher via YouTube)
- Nyströmformer: A Nyström-Based Algorithm for Approximating Self-Attention (Yannic Kilcher via YouTube)
- Recreate Google Translate - Model Training (Edan Meyer via YouTube)