Transformers: Text Classification for NLP Using BERT
Offered By: LinkedIn Learning
Course Description
Overview
Learn about transformers, the go-to architecture for NLP and computer vision tasks.
Syllabus
Introduction
- Natural language processing with transformers
- How to use the exercise files
- How transformers are used in NLP
- Transformers in production
- Transformers history
- Challenge: BERT model sizes
- Solution: BERT model sizes
- Bias in BERT
- How was BERT trained?
- Transfer learning
- Transformer: Architecture overview
- BERT model and tokenization
- Positional encodings and segment embeddings
- Tokenizers
- Self-attention
- Multi-head attention and feedforward network
- BERT and text classification
- The Datasets library
- Overview of IMDb dataset
- Using a tokenizer
- Tiny IMDb
- A training run
- Additional training runs
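The self-attention lesson above is the core mechanism behind BERT. As a rough illustration (not the course's exercise code), scaled dot-product self-attention can be sketched in plain Python: each query row is compared against every key, the scaled scores are turned into weights with a softmax, and the output is the weight-averaged value rows.

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.

    Q, K, V are lists of equal-length vectors (rows). This is a toy,
    dependency-free sketch; real implementations use batched tensors.
    """
    d = len(K[0])  # key dimension used for scaling
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # Weighted average of the value rows.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

In multi-head attention (also covered in the syllabus), this computation is repeated in parallel over several learned projections of Q, K, and V, and the per-head outputs are concatenated.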
Taught by
Jonathan Fernandes
Related Courses
- Structuring Machine Learning Projects (DeepLearning.AI via Coursera)
- Natural Language Processing on Google Cloud (Google Cloud via Coursera)
- Introduction to Learning Transfer and Life Long Learning (3L) (University of California, Irvine via Coursera)
- Advanced Deployment Scenarios with TensorFlow (DeepLearning.AI via Coursera)
- Neural Style Transfer with TensorFlow (Coursera Project Network via Coursera)