Transformers: Text Classification for NLP Using BERT
Offered By: LinkedIn Learning
Course Description
Overview
Learn about transformers, the go-to architecture for NLP and computer vision tasks.
Syllabus
Introduction
- Natural language processing with transformers
- How to use the exercise files
- How transformers are used in NLP
- Transformers in production
- Transformers history
- Challenge: BERT model sizes
- Solution: BERT model sizes
- Bias in BERT
- How was BERT trained?
- Transfer learning
- Transformer: Architecture overview
- BERT model and tokenization
- Positional encodings and segment embeddings
- Tokenizers
- Self-attention
- Multi-head attention and feedforward network
- BERT and text classification
- The Datasets library
- Overview of IMDb dataset
- Using a tokenizer
- Tiny IMDb
- A training run
- Additional training runs
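The tokenization lessons above ("BERT model and tokenization", "Tokenizers", "Using a tokenizer") cover how raw text becomes WordPiece token IDs. A minimal sketch of the usual Hugging Face pattern, assuming the transformers library and the bert-base-uncased checkpoint (the specific checkpoint is an assumption, not named in the syllabus):

```python
from transformers import AutoTokenizer

# Load the WordPiece tokenizer that ships with a BERT checkpoint.
# "bert-base-uncased" is an assumption; the course may use another size.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

encoded = tokenizer(
    "Transformers are the go-to architecture for NLP.",
    padding="max_length",   # pad to a fixed length
    truncation=True,        # cut off anything past max_length
    max_length=16,
)

# input_ids: vocabulary indices, with [CLS] (101) and [SEP] (102) added.
# attention_mask: 1 for real tokens, 0 for padding.
print(encoded["input_ids"])
print(encoded["attention_mask"])

# Subword splitting: rarer words become "##" continuation pieces.
print(tokenizer.tokenize("tokenization"))  # e.g. ['token', '##ization']
```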
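For "Positional encodings and segment embeddings": BERT sums three learned embeddings per token (word, position, and segment) before the encoder stack. A short sketch inspecting those tables on a pretrained model, again assuming bert-base-uncased:

```python
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")
emb = model.embeddings

print(emb.word_embeddings)        # Embedding(30522, 768): one row per WordPiece
print(emb.position_embeddings)    # Embedding(512, 768): learned, not sinusoidal
print(emb.token_type_embeddings)  # Embedding(2, 768): segment A vs. segment B
# BERT sums all three per token, then applies LayerNorm and dropout.
```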
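The "Self-attention" lesson builds on scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V; multi-head attention runs several such heads in parallel and concatenates their outputs. A toy NumPy sketch of a single head (the sizes are illustrative, not BERT's):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention for one head."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # token-to-token similarity
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mix of value vectors

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 4, 8, 8      # toy sizes, not BERT's real ones
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)
```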
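The closing run of lessons ("The Datasets library" through "Additional training runs") fine-tunes BERT for sentiment classification on IMDb. A hedged end-to-end sketch using the Hugging Face datasets and transformers APIs; the subset sizes and hyperparameters are illustrative stand-ins, not the course's actual values:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# IMDb: 25k train / 25k test movie reviews, labeled positive or negative.
imdb = load_dataset("imdb")

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], padding="max_length", truncation=True)

# A "tiny IMDb": small shuffled slices so a training run finishes quickly.
# 1000/200 are illustrative sizes, not the course's.
tiny_train = imdb["train"].shuffle(seed=42).select(range(1000)).map(tokenize, batched=True)
tiny_test = imdb["test"].shuffle(seed=42).select(range(200)).map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # binary sentiment head on top of BERT

args = TrainingArguments(
    output_dir="bert-imdb",
    num_train_epochs=1,                 # one pass for a quick first run
    per_device_train_batch_size=8,
)

Trainer(model=model, args=args,
        train_dataset=tiny_train, eval_dataset=tiny_test).train()
```

Later runs would typically vary the learning rate, epoch count, or subset size, which is presumably what "Additional training runs" explores.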
Taught by
Jonathan Fernandes
Related Courses
- TensorFlow: Working with NLP (LinkedIn Learning)
- TransGAN: Two Transformers Can Make One Strong GAN - Machine Learning Research Paper Explained (Yannic Kilcher via YouTube)
- Nyströmformer: A Nyström-Based Algorithm for Approximating Self-Attention (Yannic Kilcher via YouTube)
- Recreate Google Translate - Model Training (Edan Meyer via YouTube)
- Let's Build GPT - From Scratch, in Code, Spelled Out (Andrej Karpathy via YouTube)