TensorFlow: Working with NLP

Offered By: LinkedIn Learning

Tags

TensorFlow, BERT, Transfer Learning, Transformers, Transformer Architecture, Self-Attention, Fine-Tuning

Course Description

Overview

Learn how to use transformer models such as BERT for natural language processing with TensorFlow, covering transfer learning, tokenization, self-attention, and fine-tuning.

Syllabus

Introduction
  • Why TensorFlow?
  • What you should know
  • What is TensorFlow?
1. NLP and Transformers
  • What is NLP?
  • Transformers, their use, and history
  • Transformers for NLP
  • Challenge: NLP model size
  • Solution: NLP model size
2. BERT and Transfer Learning
  • Bias in BERT and GPT
  • How was BERT trained?
  • Transfer learning
3. Transformers and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Tokenizers
  • Self-attention
  • Multi-head attention and feedforward network
  • Fine-tuning BERT
Conclusion
  • Next steps
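
The later syllabus topics (tokenizers, self-attention, and fine-tuning BERT) correspond to a workflow that can be sketched in a few lines of TensorFlow. The sketch below is illustrative only and is not taken from the course materials; it assumes the Hugging Face transformers library, the bert-base-uncased checkpoint, and a made-up two-sentence sentiment dataset.

  # Illustrative sketch (not course material): fine-tuning BERT for
  # binary text classification with TensorFlow and Hugging Face transformers.
  import tensorflow as tf
  from transformers import BertTokenizer, TFBertForSequenceClassification

  # Load a pretrained tokenizer and a BERT model with a classification head.
  tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
  model = TFBertForSequenceClassification.from_pretrained(
      "bert-base-uncased", num_labels=2
  )

  # Hypothetical toy data: two sentences with sentiment labels.
  texts = ["A clear introduction to transformers.", "The pacing was too fast."]
  labels = tf.constant([1, 0])

  # Tokenize into TensorFlow tensors (input_ids, attention_mask, token_type_ids).
  encodings = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")

  # Transfer learning: fine-tune the pretrained weights with a small learning
  # rate so they are adapted rather than overwritten.
  model.compile(
      optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
      loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
      metrics=["accuracy"],
  )
  model.fit(dict(encodings), labels, epochs=1, batch_size=2)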

Taught by

Jonathan Fernandes

Related Courses

Linear Circuits
Georgia Institute of Technology via Coursera
Introduction to Energy and Power Engineering (مقدمة في هندسة الطاقة والقوى)
King Abdulaziz University via Rwaq (رواق)
Magnetic Materials and Devices
Massachusetts Institute of Technology via edX
Linear Circuits 2: AC Analysis
Georgia Institute of Technology via Coursera
Electric Power Transmission (Transmisión de energía eléctrica)
Tecnológico de Monterrey via edX