Transformer Models and BERT Model
Offered By: Pluralsight
Course Description
Overview
This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how it is used to build the BERT model. You also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference.
This course takes approximately 45 minutes to complete.
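The self-attention mechanism mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration of scaled dot-product self-attention, not code from the course; the matrix shapes and the random projection weights are assumptions chosen for the example.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings."""
    # Project the inputs into query, key, and value spaces
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Score every token against every other token, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax each row so the attention weights for a token sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted mixture of the value vectors
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))    # 4 tokens, 8-dimensional embeddings (assumed sizes)
Wq = rng.normal(size=(8, 8))
Wk = rng.normal(size=(8, 8))
Wv = rng.normal(size=(8, 8))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per token
```

In the full Transformer, this operation is repeated across multiple heads and layers; BERT stacks the encoder side of this architecture.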
Syllabus
- Introduction 23 mins
Taught by
Pluralsight
Related Courses
- Today Unsupervised Sentence Transformers, Tomorrow Skynet - How TSDAE Works (James Briggs via YouTube)
- Tradeoffs Between Robustness and Accuracy - Percy Liang (Institute for Advanced Study via YouTube)
- Improving Natural Language Understanding Through Adversarial Testing (Stanford University via YouTube)
- Transformer Models and BERT Model (Google via Google Cloud Skills Boost)
- Transformer Models and BERT Model - Locales (Google via Google Cloud Skills Boost)
