Building Makemore - Building a WaveNet
Offered By: Andrej Karpathy via YouTube
Course Description
Overview
Dive into an in-depth tutorial on building a WaveNet-like convolutional neural network architecture, expanding upon a 2-layer MLP from previous lessons. Explore the process of deepening the network with a tree-like structure, mirroring the WaveNet (2016) architecture from DeepMind. Gain valuable insights into torch.nn, its inner workings, and typical deep learning development processes. Follow along as the instructor navigates through documentation, manages multidimensional tensor shapes, and transitions between Jupyter notebooks and repository code. Learn how to implement and train the WaveNet model, address common bugs, and scale up the architecture. Discover the concept of dilated causal convolutions and their efficient implementation in deep learning models. Conclude with an experimental harness, discussions on improving model performance, and future directions for enhancing WaveNet on the given dataset.
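The tree-like deepening described above can be illustrated with plain NumPy reshapes. This is a minimal sketch of the grouping trick only: shapes and variable names are illustrative, and the Linear/BatchNorm1d/Tanh layers that the lesson places between each grouping step are omitted.

```python
import numpy as np

# Illustrative shapes: a batch of 4 examples, a context of 8 characters,
# each character embedded into 10 dimensions.
B, T, C = 4, 8, 10
x = np.random.randn(B, T, C)

# Flat MLP: crush all 8 character embeddings into one vector at once.
flat = x.reshape(B, T * C)                 # (4, 80)

# WaveNet-style tree: fuse only adjacent pairs at each level, so
# information is combined gradually rather than all at once.
level1 = x.reshape(B, T // 2, 2 * C)       # (4, 4, 20): pairs of chars
level2 = level1.reshape(B, T // 4, 4 * C)  # (4, 2, 40): pairs of pairs
level3 = level2.reshape(B, T // 8, 8 * C)  # (4, 1, 80): full context
```

Because the array is row-major, each reshape concatenates the embeddings of two consecutive positions, which is exactly the pairwise fusion the tree structure needs.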
Syllabus
intro
starter code walkthrough
let’s fix the learning rate plot
pytorchifying our code: layers, containers, torch.nn, fun bugs
overview: WaveNet
dataset bump the context size to 8
re-running baseline code on block_size 8
implementing WaveNet
training the WaveNet: first pass
fixing batchnorm1d bug
re-training WaveNet with bug fix
scaling up our WaveNet
experimental harness
WaveNet but with “dilated causal convolutions”
torch.nn
the development process of building deep neural nets
going forward
improve on my loss! how far can we improve a WaveNet on this data?
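The "dilated causal convolutions" chapter refers to the WaveNet trick of letting each output depend only on the current and earlier inputs, with the kernel taps spaced `dilation` steps apart so the receptive field grows exponentially with depth. A minimal NumPy sketch of a single such layer, assuming a 1D signal and a small kernel (the function name and shapes are illustrative, not from the lesson):

```python
import numpy as np

def dilated_causal_conv1d(x, w, dilation):
    """Causal 1D convolution: output[t] depends only on x[t] and earlier
    samples, with kernel taps spaced `dilation` steps apart.
    x: (T,) input signal, w: (K,) kernel."""
    K = len(w)
    # Left-pad with zeros so every output has a full causal receptive field.
    pad = dilation * (K - 1)
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(w[k] * xp[pad + t - dilation * (K - 1 - k)] for k in range(K))
        for t in range(len(x))
    ])
```

Stacking such layers with dilations 1, 2, 4, ... doubles the receptive field at each layer, which is why the tree-structured model above can also be read as a stack of dilated causal convolutions.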
Taught by
Andrej Karpathy
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam