Transformers from Scratch - Part 2: Building and Training a Weather Prediction Model
Offered By: Trelis Research via YouTube
Course Description
Overview
Dive into the second part of a comprehensive video series on building transformers from scratch. Explore the differences between encoder and decoder architectures, understand the GPT-4o architecture, and revisit the transformer model for weather prediction. Learn about pre-layer norm versus post-layer norm, and compare RoPE with sinusoidal positional embeddings. Follow along as dummy data is generated, the transformer architecture is initialized, and a forward pass test is conducted. Set up and test a training loop on dummy data before importing real weather data. Visualize training results and evaluate the model's weather prediction capabilities. Discuss the implications of loss graph volatility and explore strategies for further model improvement. Access additional resources and a Colab notebook to enhance your learning experience.
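The overview contrasts RoPE with the sinusoidal positional embeddings used in the original transformer. A minimal sketch of the sinusoidal scheme in plain Python (function and variable names are illustrative, not taken from the course notebook): even dimensions use sine, odd dimensions use cosine, and the wavelength grows geometrically with depth, so each position gets a unique, smoothly varying code.

```python
import math

def sinusoidal_positions(seq_len, d_model):
    # Build a seq_len x d_model table of sinusoidal positional embeddings.
    # Even dims: sin(pos * freq); odd dims: cos(pos * freq), where the
    # frequency shrinks geometrically from 1 down to 1/10000 across depth.
    table = []
    for pos in range(seq_len):
        row = []
        for i in range(d_model):
            freq = 1.0 / (10000 ** ((2 * (i // 2)) / d_model))
            angle = pos * freq
            row.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
        table.append(row)
    return table

pe = sinusoidal_positions(seq_len=8, d_model=4)
# Position 0 encodes as 0 on even (sin) dims and 1 on odd (cos) dims.
```

In practice this table is simply added to the token embeddings before the first transformer block, whereas RoPE instead rotates the query and key vectors inside attention.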
Syllabus
Welcome and Link to Colab Notebook
Encoder versus Decoder Architectures
What is the GPT-4o architecture?
Recap of transformer for weather prediction
Pre-layer norm versus post-layer norm
RoPE vs Sinusoidal Positional Embeddings
Dummy Data Generation
Transformer Architecture Initialisation
Forward pass test
Training loop setup and test on dummy data
Weather data import
Training and Results Visualisation
Can the model predict the weather?
Is volatility in the loss graph a problem?
How to improve the model further?
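The syllabus covers pre-layer norm versus post-layer norm. The difference is just where the normalisation sits relative to the residual connection; a small sketch under the assumption of a generic sublayer (the `double` stand-in below is illustrative, not the course's attention or feed-forward layer):

```python
import math

def layer_norm(x, eps=1e-5):
    # Normalise a vector to zero mean and unit variance.
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / math.sqrt(var + eps) for v in x]

def post_ln_block(x, sublayer):
    # Post-LN (original transformer): add the residual, THEN normalise.
    return layer_norm([a + b for a, b in zip(x, sublayer(x))])

def pre_ln_block(x, sublayer):
    # Pre-LN (GPT-style): normalise BEFORE the sublayer; the residual path
    # stays an identity, which tends to make deep stacks easier to train.
    return [a + b for a, b in zip(x, sublayer(layer_norm(x)))]

x = [1.0, 2.0, 3.0, 4.0]
double = lambda v: [2.0 * a for a in v]  # stand-in for attention/FFN
post_out = post_ln_block(x, double)      # output is re-normalised
pre_out = pre_ln_block(x, double)        # output keeps the raw residual
```

Note the observable difference: the post-LN output always has zero mean, while the pre-LN output preserves the unnormalised residual stream.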
Taught by
Trelis Research
Related Courses
4.0 Shades of Digitalisation for the Chemical and Process Industries (University of Padova via FutureLearn)
A Day in the Life of a Data Engineer (Amazon Web Services via AWS Skill Builder)
FinTech for Finance and Business Leaders (ACCA via edX)
Accounting Data Analytics (University of Illinois at Urbana-Champaign via Coursera)