TensorFlow: Working with NLP
Offered By: LinkedIn Learning
Course Description
Overview
Learn how to use transformer models for natural language processing (NLP) with TensorFlow.
Syllabus
Introduction
- Why TensorFlow?
- What you should know
- What is TensorFlow?
- What is NLP?
- Transformers, their use, and history
- Transformers for NLP
- Challenge: NLP model size
- Solution: NLP model size
- Bias in BERT and GPT
- How was BERT trained?
- Transfer learning
- Transformer: Architecture overview
- BERT model and tokenization
- Tokenizers
- Self-attention
- Multi-head attention and feedforward network
- Fine-tuning BERT
- Next steps
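The syllabus covers self-attention, the core operation inside transformer models such as BERT. As a rough illustration of that topic (not code from the course), here is a minimal NumPy sketch of scaled dot-product attention; the toy dimensions and random projection matrices are made up for the example:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

# Toy example: 3 tokens with embedding dimension 4 (illustrative values only).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))                          # token embeddings
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
out, attn = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)          # one output vector per token: (3, 4)
print(attn.sum(axis=-1))  # each token's attention weights sum to 1
```

Multi-head attention, also on the syllabus, runs several such heads in parallel on learned projections and concatenates their outputs.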
Taught by
Jonathan Fernandes
Related Courses
- Transformers: Text Classification for NLP Using BERT (LinkedIn Learning)
- TransGAN - Two Transformers Can Make One Strong GAN - Machine Learning Research Paper Explained (Yannic Kilcher via YouTube)
- Nyströmformer - A Nyström-Based Algorithm for Approximating Self-Attention (Yannic Kilcher via YouTube)
- Recreate Google Translate - Model Training (Edan Meyer via YouTube)
- Let's Build GPT - From Scratch, in Code, Spelled Out (Andrej Karpathy via YouTube)