YoVDO

TensorFlow: Working with NLP

Offered By: LinkedIn Learning

Tags

  • TensorFlow Courses
  • BERT Courses
  • Transfer Learning Courses
  • Transformers Courses
  • Transformer Architecture Courses
  • Self-Attention Courses
  • Fine-Tuning Courses

Course Description

Overview

Learn how to use transformer models such as BERT for natural language processing (NLP) tasks with TensorFlow.

Syllabus

Introduction
  • Why TensorFlow?
  • What you should know
  • What is TensorFlow?
1. NLP and Transformers
  • What is NLP?
  • Transformers, their use, and history
  • Transformers for NLP
  • Challenge: NLP model size
  • Solution: NLP model size
2. BERT and Transfer Learning
  • Bias in BERT and GPT
  • How was BERT trained?
  • Transfer learning
3. Transformers and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Tokenizers
  • Self-attention
  • Multi-head attention and feedforward network
  • Fine-tuning BERT
Conclusion
  • Next steps
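
The self-attention mechanism covered in chapter 3 can be sketched in plain NumPy (a minimal single-head illustration with made-up dimensions, not the course's TensorFlow code): each token's query is compared against every token's key, the scaled scores are softmax-normalized, and the result weights a sum of the value vectors.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape
    (seq_len, d_model). Wq/Wk/Wv are learned projections (random here)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# Toy example: 4 tokens, 8-dim embeddings (arbitrary illustrative sizes).
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq = rng.normal(size=(d_model, d_k))
Wk = rng.normal(size=(d_model, d_k))
Wv = rng.normal(size=(d_model, d_k))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Multi-head attention, as the syllabus notes, simply runs several such heads in parallel on smaller projections and concatenates their outputs before a feedforward layer.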

Taught by

Jonathan Fernandes

Related Courses

Transformers: Text Classification for NLP Using BERT
LinkedIn Learning
TransGAN - Two Transformers Can Make One Strong GAN - Machine Learning Research Paper Explained
Yannic Kilcher via YouTube
Nyströmformer - A Nyström-Based Algorithm for Approximating Self-Attention
Yannic Kilcher via YouTube
Recreate Google Translate - Model Training
Edan Meyer via YouTube
Let's Build GPT - From Scratch, in Code, Spelled Out
Andrej Karpathy via YouTube