TensorFlow: Working with NLP

Offered By: LinkedIn Learning

Tags

TensorFlow, BERT, Transfer Learning, Transformers, Transformer Architecture, Self-Attention, Fine-Tuning

Course Description

Overview

Learn how to use transformer models for natural language processing (NLP) with TensorFlow.

Syllabus

Introduction
  • Why TensorFlow?
  • What you should know
  • What is TensorFlow?
1. NLP and Transformers
  • What is NLP?
  • Transformers, their use, and history
  • Transformers for NLP
  • Challenge: NLP model size
  • Solution: NLP model size
2. BERT and Transfer Learning
  • Bias in BERT and GPT
  • How was BERT trained?
  • Transfer learning
3. Transformers and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Tokenizers
  • Self-attention
  • Multi-head attention and feedforward network
  • Fine-tuning BERT
Conclusion
  • Next steps
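The self-attention topic in chapter 3 has a compact mathematical core: each token's output is a weighted average of all tokens' values, with weights given by softmax(QK^T / sqrt(d_k)). This is a minimal NumPy sketch of that computation, not code from the course; the input matrix and identity Q/K/V projections are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

# Three toy token embeddings with d_model = 4; using the embeddings
# directly as Q, K, and V (i.e. identity projections) for simplicity.
x = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0, 0.0]])
out, w = scaled_dot_product_attention(x, x, x)
print(w.shape)  # (3, 3): each token attends to every token, itself included
```

Multi-head attention, also covered in chapter 3, runs several such attention computations in parallel over learned linear projections of Q, K, and V, then concatenates the results.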
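The tokenization topics in chapter 3 center on BERT's WordPiece scheme, which splits out-of-vocabulary words into known subword pieces, marking continuation pieces with a "##" prefix. Below is a greedy longest-match-first sketch of that idea; the tiny vocabulary and function name are illustrative assumptions, not BERT's actual tokenizer or vocabulary.

```python
def wordpiece_tokenize(word, vocab):
    """Greedy longest-match-first subword split, WordPiece-style.
    Continuation pieces carry a '##' prefix, as in BERT."""
    pieces, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub        # mark non-initial pieces
            if sub in vocab:
                piece = sub             # longest match found
                break
            end -= 1                    # shrink the candidate and retry
        if piece is None:
            return ["[UNK]"]            # no piece matched at this position
        pieces.append(piece)
        start = end
    return pieces

# Toy vocabulary (illustrative; BERT's real vocabulary has ~30k entries).
vocab = {"play", "##ing", "##ed", "token", "##izer"}
print(wordpiece_tokenize("playing", vocab))    # ['play', '##ing']
print(wordpiece_tokenize("tokenizer", vocab))  # ['token', '##izer']
```

Because every word reduces to pieces from a fixed vocabulary (or "[UNK]"), the model's embedding table stays small while still handling unseen words.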

Taught by

Jonathan Fernandes

Related Courses

Sentiment Analysis with Deep Learning using BERT
Coursera Project Network via Coursera
Natural Language Processing with Attention Models
DeepLearning.AI via Coursera
Fine Tune BERT for Text Classification with TensorFlow
Coursera Project Network via Coursera
Deploy a BERT question answering bot on Django
Coursera Project Network via Coursera
Generating discrete sequences: language and music
Ural Federal University via edX