
Why Transformers Work

Offered By: EuroPython Conference via YouTube

Tags

EuroPython Courses, Machine Learning Courses, Time Series Analysis Courses, Long Short-Term Memory (LSTM) Courses, Transformers Courses, Word Embeddings Courses

Course Description

Overview

Explore the inner workings of machine learning algorithms used in Rasa, focusing on the transformer's role in replacing RNNs for natural language processing and dialogue handling. This technical talk from EuroPython 2020 delves into why transformers have become integral to many algorithms. Witness a live demonstration comparing typical LSTM errors with transformer performance, and gain insights through clear diagrams with minimal mathematical complexity. Learn about time series, word embeddings, attention mechanisms, transformer layers, and recurrent neural networks. Understand the advantages of transformer embeddings and see their practical application in a demo. Conclude with a Q&A session to address any lingering questions about this powerful machine learning technique.
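The attention mechanism the talk introduces is commonly formulated as scaled dot-product attention: each query token scores every key, the scores pass through a softmax, and the result is a weighted average of the value vectors. The following toy sketch (pure Python, not Rasa's actual implementation) illustrates that idea:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors (toy sketch)."""
    d = len(queries[0])
    outputs = []
    for q in queries:
        # similarity of this query to every key, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # output is a convex combination of the value vectors
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

# Self-attention over three 2-d token embeddings (made-up numbers)
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = attention(tokens, tokens, tokens)
```

Because every token attends to every other token in one step, there is no sequential bottleneck as in an LSTM, which is the core advantage the talk explores.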

Syllabus

Introduction
The Problem
Time Series
Word embeddings
Math trick
Attention
Attention Comparison
Transformer Layer
Recurrent Neural Networks
Transformer Embedding
Transformer Demo
Questions


Taught by

EuroPython Conference

Related Courses

Linear Circuits
Georgia Institute of Technology via Coursera
Introduction to Energy and Power Engineering
King Abdulaziz University via Rwaq (رواق)
Magnetic Materials and Devices
Massachusetts Institute of Technology via edX
Linear Circuits 2: AC Analysis
Georgia Institute of Technology via Coursera
Electric Power Transmission
Tecnológico de Monterrey via edX