
TensorFlow: Working with NLP

Offered By: LinkedIn Learning

Tags

TensorFlow, BERT, Transfer Learning, Transformers, Transformer Architecture, Self-Attention, Fine-Tuning

Course Description

Overview

Learn how to use transformer models such as BERT for natural language processing with TensorFlow.

Syllabus

Introduction
  • Why TensorFlow?
  • What you should know
  • What is TensorFlow?
1. NLP and Transformers
  • What is NLP?
  • Transformers, their use, and history
  • Transformers for NLP
  • Challenge: NLP model size
  • Solution: NLP model size
2. BERT and Transfer Learning
  • Bias in BERT and GPT
  • How was BERT trained?
  • Transfer learning
3. Transformers and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Tokenizers
  • Self-attention (see the first sketch after this syllabus)
  • Multi-head attention and feedforward network
  • Fine-tuning BERT (see the second sketch after this syllabus)
Conclusion
  • Next steps
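
The self-attention topic above can be illustrated with a minimal scaled dot-product attention function in TensorFlow. This is a rough sketch for orientation, not material from the course; tensor shapes and names are illustrative only.

import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_model); in self-attention all three come
    # from the same sequence of token embeddings.
    d_k = tf.cast(tf.shape(k)[-1], tf.float32)
    # Compare every query with every key, scaled to keep softmax gradients stable.
    scores = tf.matmul(q, k, transpose_b=True) / tf.math.sqrt(d_k)
    weights = tf.nn.softmax(scores, axis=-1)   # attention weights per token pair
    return tf.matmul(weights, v), weights      # weighted sum of the value vectors

x = tf.random.normal((1, 4, 8))                # 1 sequence, 4 tokens, 8-dim embeddings
output, attn = scaled_dot_product_attention(x, x, x)
print(output.shape, attn.shape)                # (1, 4, 8) (1, 4, 4)

Multi-head attention, also covered in the syllabus, runs several such attention computations in parallel over learned projections of q, k, and v and concatenates the results.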
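The tokenization and fine-tuning topics can be sketched in a similar way. The snippet below assumes the Hugging Face transformers library with TensorFlow weights, which is not necessarily what the course uses; the model name, toy data, and hyperparameters are placeholders.

import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Pretrained BERT plus a small, randomly initialized classification head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFAutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

texts = ["great course", "not helpful at all"]   # toy labeled examples
labels = tf.constant([1, 0])

# WordPiece tokenization: text -> input_ids, attention_mask, token_type_ids.
encodings = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")

# Transfer learning / fine-tuning: reuse the pretrained encoder and train on
# the downstream task with a small learning rate for a few epochs.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(dict(encodings), labels, epochs=1, batch_size=2)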

Taught by

Jonathan Fernandes

Related Courses

Artificial Intelligence Foundations: Neural Networks
LinkedIn Learning
Transformers: Text Classification for NLP Using BERT
LinkedIn Learning
Let's Learn Natural Language Processing with BERT! - NLP Techniques Leading from Attention and Transformers to BERT -
Udemy
Complete Natural Language Processing Tutorial in Python
Keith Galli via YouTube
Perceiver - General Perception with Iterative Attention
Yannic Kilcher via YouTube