
Recreate Google Translate - Model Training

Offered By: Edan Meyer via YouTube

Tags

Natural Language Processing (NLP) Courses
Deep Learning Courses
Attention Mechanisms Courses
Self-Attention Courses

Course Description

Overview

Explore the final video in the Neural Machine Translation (NMT) series, which covers model training and testing for recreating Google Translate. Learn about model parameters, the training process, and testing methodology. The video applies concepts covered earlier in the series, including NLP models for sequential data, attention mechanisms, self-attention, the mT5 model, and the Hugging Face Transformers library. Watch the demo being recreated and observe real-time translation tests. Access the GitHub repository, Colab notebook, and additional resources to deepen your understanding of transformer models and their applications in multilingual machine translation.

Syllabus

Intro
Parameters
Training
Testing
Results
Our App


Taught by

Edan Meyer

Related Courses

Transformers: Text Classification for NLP Using BERT
LinkedIn Learning
TensorFlow: Working with NLP
LinkedIn Learning
TransGAN - Two Transformers Can Make One Strong GAN - Machine Learning Research Paper Explained
Yannic Kilcher via YouTube
Nyströmformer - A Nyström-Based Algorithm for Approximating Self-Attention
Yannic Kilcher via YouTube
Let's Build GPT - From Scratch, in Code, Spelled Out
Andrej Karpathy via YouTube