Why Transformers Work
Offered By: EuroPython Conference via YouTube
Course Description
Overview
Explore the inner workings of machine learning algorithms used in Rasa, focusing on the transformer's role in replacing RNNs for natural language processing and dialogue handling. This technical talk from EuroPython 2020 delves into why transformers have become integral to many algorithms. Witness a live demonstration comparing typical LSTM errors with transformer performance, and gain insights through clear diagrams with minimal mathematical complexity. Learn about time series, word embeddings, attention mechanisms, transformer layers, and recurrent neural networks. Understand the advantages of transformer embeddings and see their practical application in a demo. Conclude with a Q&A session to address any lingering questions about this powerful machine learning technique.
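The attention mechanism the talk covers is commonly implemented as scaled dot-product self-attention. As a rough illustration of the idea (not code from the talk itself), a minimal NumPy sketch might look like this:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d) arrays of query, key, and value vectors."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # pairwise similarity of queries and keys
    # Softmax over the key axis, shifted for numerical stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted average of the values

# Tiny example: 3 tokens with 4-dimensional embeddings (self-attention)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

Unlike an RNN or LSTM, every token here attends to every other token in a single step, which is a key reason transformers avoid the long-range errors the talk demonstrates.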
Syllabus
Introduction
The Problem
Time Series
Word embeddings
Math trick
Attention
Attention Comparison
Transformer Layer
Recurrent Neural Networks
Transformer Embedding
Transformer Demo
Questions
Taught by
EuroPython Conference
Related Courses
A Brief History of Data Storage - EuroPython Conference via YouTube
Breaking the Stereotype - Evolution & Persistence of Gender Bias in Tech - EuroPython Conference via YouTube
We Can Get More from Spatial, GIS, and Public Domain Datasets - EuroPython Conference via YouTube
Using NLP to Detect Knots in Protein Structures - EuroPython Conference via YouTube
The Challenges of Doing Infra-As-Code Without "The Cloud" - EuroPython Conference via YouTube