
Transformer Models and BERT Model

Offered By: Pluralsight

Tags

BERT Courses, Deep Learning Courses, Text Classification Courses, Transformer Architecture Courses, Self-Attention Mechanisms Courses, Natural Language Inference Courses

Course Description

Overview

This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how they are used to build the BERT model. You also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference.

This course is estimated to take approximately 45 minutes to complete.
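As an illustration of the self-attention mechanism highlighted in the overview, the short Python sketch below computes single-head scaled dot-product attention with NumPy. It is not taken from the course materials; the function name, weight matrices, and array sizes are assumptions chosen only to make the example self-contained and runnable.

import numpy as np

def self_attention(X, W_q, W_k, W_v):
    # Project token embeddings into queries, keys, and values.
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    # Scaled dot-product similarity between every pair of tokens.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over each row turns similarities into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is an attention-weighted mix of the value vectors.
    return weights @ V

# Illustrative data: 4 tokens with embedding size 8 (sizes are arbitrary).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)  # (4, 8)

In the full Transformer and BERT models, several such attention heads run in parallel and their outputs are combined, but the core computation is the one sketched here.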


Syllabus

  • Introduction 23 mins

Taught by

Pluralsight

Related Courses

Artificial Intelligence Foundations: Neural Networks
LinkedIn Learning
Transformers: Text Classification for NLP Using BERT
LinkedIn Learning
TensorFlow: Working with NLP
LinkedIn Learning
BERTによる自然言語処理を学ぼう! -Attention、TransformerからBERTへとつながるNLP技術- (Learn Natural Language Processing with BERT! NLP Techniques from Attention and Transformer to BERT)
Udemy
Complete Natural Language Processing Tutorial in Python
Keith Galli via YouTube