Transformers: Text Classification for NLP Using BERT

Offered By: LinkedIn Learning

Tags

BERT, Machine Learning, Deep Learning, Transfer Learning, Transformers, Text Classification, Transformer Architecture, Self-Attention

Course Description

Overview

Learn about transformers, the go-to architecture for NLP and computer vision tasks, and how to fine-tune a pretrained BERT model for text classification.

Syllabus

Introduction
  • Natural language processing with transformers
  • How to use the exercise files
1. NLP and Transformers
  • How transformers are used in NLP
  • Transformers in production
  • Transformers history
  • Challenge: BERT model sizes
  • Solution: BERT model sizes
2. BERT and Transfer Learning
  • Bias in BERT
  • How was BERT trained?
  • Transfer learning
3. Transformer Architecture and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Positional encodings and segment embeddings
  • Tokenizers
  • Self-attention
  • Multi-head attention and feedforward network (the core attention formulas are sketched after the syllabus)
4. Text Classification
  • BERT and text classification
  • The Datasets library
  • Overview of IMDb dataset
  • Using a tokenizer
  • Tiny IMDb
  • A training run (a minimal code sketch follows the syllabus)
Conclusion
  • Additional training runs
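
The self-attention and multi-head attention lessons in section 3 cover the standard scaled dot-product attention from "Attention Is All You Need" (Vaswani et al., 2017). As a compact reference, in standard notation (not necessarily the course's own):

    \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V

    \mathrm{MultiHead}(Q, K, V) = \mathrm{Concat}(\mathrm{head}_1, \ldots, \mathrm{head}_h)\, W^{O},
    \quad \mathrm{head}_i = \mathrm{Attention}(Q W_i^{Q},\; K W_i^{K},\; V W_i^{V})

The 1/sqrt(d_k) scaling keeps the dot products from growing with the key dimension, and each head attends in its own lower-dimensional subspace (d_k = d_model / h; for BERT-base, d_model = 768 and h = 12).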
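
Section 4 ties the pieces together: load IMDb with the Datasets library, tokenize it, shrink it to a "Tiny IMDb" subset, and run a short fine-tuning pass. Below is a minimal sketch of that workflow using the Hugging Face datasets and transformers libraries; the checkpoint (bert-base-uncased), subset sizes, and hyperparameters are illustrative assumptions, not the course's exact settings.

# Sketch of the section 4 workflow; the model choice and hyperparameters
# here are assumptions, not the course's exact configuration.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification,
                          AutoTokenizer, Trainer, TrainingArguments)

# Load IMDb and keep a small slice, in the spirit of "Tiny IMDb".
imdb = load_dataset("imdb")
tiny_train = imdb["train"].shuffle(seed=42).select(range(1000))
tiny_test = imdb["test"].shuffle(seed=42).select(range(500))

# BERT's WordPiece tokenizer: truncate/pad reviews to a fixed length.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=256)

tiny_train = tiny_train.map(tokenize, batched=True)
tiny_test = tiny_test.map(tokenize, batched=True)

# Transfer learning: a fresh 2-label classification head on pretrained BERT.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

args = TrainingArguments(output_dir="bert-imdb", num_train_epochs=1,
                         per_device_train_batch_size=8)
trainer = Trainer(model=model, args=args,
                  train_dataset=tiny_train, eval_dataset=tiny_test)
trainer.train()

The "Additional training runs" lesson in the conclusion presumably revisits this loop with different settings; with Trainer, that amounts to changing TrainingArguments (epochs, learning rate, batch size) and calling trainer.train() again.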

Taught by

Jonathan Fernandes

Related Courses

TensorFlow: Working with NLP
LinkedIn Learning
TransGAN - Two Transformers Can Make One Strong GAN - Machine Learning Research Paper Explained
Yannic Kilcher via YouTube
Nyströmformer - A Nyström-Based Algorithm for Approximating Self-Attention
Yannic Kilcher via YouTube
Recreate Google Translate - Model Training
Edan Meyer via YouTube
Let's Build GPT - From Scratch, in Code, Spelled Out
Andrej Karpathy via YouTube