YoVDO

TensorFlow: Working with NLP

Offered By: LinkedIn Learning

Tags

TensorFlow, BERT, Transfer Learning, Transformers, Transformer Architecture, Self-Attention, Fine-Tuning

Course Description

Overview

Learn how to use transformer models for natural language processing with TensorFlow.

Syllabus

Introduction
  • Why TensorFlow?
  • What you should know
  • What is TensorFlow?
1. NLP and Transformers
  • What is NLP?
  • Transformers, their use, and history
  • Transformers for NLP
  • Challenge: NLP model size
  • Solution: NLP model size
2. BERT and Transfer Learning
  • Bias in BERT and GPT
  • How was BERT trained?
  • Transfer learning
3. Transformers and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Tokenizers
  • Self-attention
  • Multi-head attention and feedforward network
  • Fine-tuning BERT
Conclusion
  • Next steps
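To give a flavor of the self-attention topic covered in chapter 3, here is a minimal single-head scaled dot-product attention sketch in plain Python. This is an illustrative example only, not code from the course; the function names and the toy 3-token input are invented for this sketch.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(queries, keys, values):
    """Scaled dot-product self-attention for a single head.

    queries, keys, values: lists of vectors (lists of floats),
    all of dimensionality d_k. Returns one output vector per query.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        # Score each key against the query: dot(q, k) / sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        # Normalize scores into attention weights.
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
    return outputs

# Toy example: a 3-token sequence with 2-dimensional embeddings,
# using the same vectors as queries, keys, and values.
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = self_attention(x, x, x)
```

In the course's setting, each token's query, key, and value vectors come from learned projections, and multi-head attention runs several such heads in parallel; TensorFlow provides this as a built-in layer (`tf.keras.layers.MultiHeadAttention`).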

Taught by

Jonathan Fernandes

Related Courses

Natural Language Processing with Attention Models
DeepLearning.AI via Coursera
Large Language Models: Foundation Models from the Ground Up
Databricks via edX
Artificial Intelligence in Social Media Analytics
Johns Hopkins University via Coursera
Chatbots
Johns Hopkins University via Coursera
Embedding Models: From Architecture to Implementation
DeepLearning.AI via Coursera