Transformers: Text Classification for NLP Using BERT

Offered By: LinkedIn Learning

Tags

BERT, Machine Learning, Deep Learning, Transfer Learning, Transformers, Text Classification, Transformer Architecture, Self-Attention

Course Description

Overview

Learn about transformers, the go-to architecture for natural language processing (NLP) and computer vision tasks, and see how a pretrained BERT model can be fine-tuned for text classification.

Syllabus

Introduction
  • Natural language processing with transformers
  • How to use the exercise files
1. NLP and Transformers
  • How transformers are used in NLP
  • Transformers in production
  • Transformers history
  • Challenge: BERT model sizes
  • Solution: BERT model sizes
2. BERT and Transfer Learning
  • Bias in BERT
  • How was BERT trained?
  • Transfer learning
3. Transformer Architecture and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Positional encodings and segment embeddings
  • Tokenizers
  • Self-attention
  • Multi-head attention and feedforward network
4. Text Classification
  • BERT and text classification
  • The Datasets library
  • Overview of IMDb dataset
  • Using a tokenizer
  • Tiny IMDb
  • A training run
Conclusion
  • Additional training runs
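
Section 3 of the syllabus covers self-attention and multi-head attention, the core operations of the transformer encoder that BERT is built from. As a quick reference (this is the standard formulation from the original transformer paper, not material quoted from the course), scaled dot-product attention is

    \[ \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right) V \]

where Q, K, and V are the query, key, and value matrices computed from the input embeddings, and d_k is the key dimension; dividing by \sqrt{d_k} keeps the dot products small enough that the softmax does not saturate. Multi-head attention runs several such attention operations in parallel and concatenates their outputs.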
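
Section 4 culminates in tokenizing a small slice of the IMDb dataset ("Tiny IMDb") and running a training pass. Below is a minimal sketch of that workflow using the Hugging Face datasets and transformers libraries that the syllabus references; the checkpoint name, subset sizes, and training hyperparameters are illustrative assumptions, not the course's exact settings.

    # Sketch: fine-tune BERT for binary sentiment classification on a tiny
    # IMDb subset. Checkpoint, subset sizes, and hyperparameters are
    # illustrative choices, not taken from the course.
    from datasets import load_dataset
    from transformers import (
        AutoTokenizer,
        AutoModelForSequenceClassification,
        TrainingArguments,
        Trainer,
    )

    # Load IMDb and carve out a tiny subset so a training run finishes quickly.
    imdb = load_dataset("imdb")
    tiny_train = imdb["train"].shuffle(seed=42).select(range(1000))
    tiny_test = imdb["test"].shuffle(seed=42).select(range(500))

    # BERT's tokenizer handles WordPiece splitting, the special [CLS] and
    # [SEP] tokens, padding, and truncation.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

    def tokenize(batch):
        return tokenizer(batch["text"], padding="max_length", truncation=True)

    tiny_train = tiny_train.map(tokenize, batched=True)
    tiny_test = tiny_test.map(tokenize, batched=True)

    # Transfer learning: start from pretrained BERT weights and add a fresh
    # two-label classification head (positive/negative review).
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2
    )

    args = TrainingArguments(
        output_dir="bert-imdb",
        num_train_epochs=1,
        per_device_train_batch_size=8,
    )

    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=tiny_train,
        eval_dataset=tiny_test,
    )
    trainer.train()

One epoch over a thousand reviews is enough to watch the loss move. Padding every example to a fixed length keeps batching simple for a first run; dynamic padding with a data collator is the more efficient alternative once the basics work.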

Taught by

Jonathan Fernandes

Related Courses

AWS Flash - Navigating the Large Language Models Landscape (Simplified Chinese)
Amazon Web Services via AWS Skill Builder
AWS Flash - Navigating the Large Language Models Landscape (Simplified Chinese) (Chinese Instructor Custom Edition)
Amazon Web Services via AWS Skill Builder
Generative AI Language Modeling with Transformers
IBM via Coursera
The Rise of Generative AI
Board Infinity via Coursera
Introduction to LLMs in Python
DataCamp