WGAN Implementation From Scratch With Gradient Penalty
Offered By: Aladdin Persson via YouTube
Course Description
Overview
Implement WGAN and WGAN-GP from scratch in PyTorch in this 26-minute tutorial video. Learn how the Wasserstein loss modifies the standard GAN objective to stabilize training. Explore the theory behind Wasserstein GANs, work through the implementation details, and code both the WGAN and WGAN-GP variants. Gain insight into the gradient penalty and how it enforces the critic's Lipschitz constraint more reliably than weight clipping. Follow along with step-by-step explanations and practical coding demonstrations to deepen your understanding of advanced GAN training techniques.
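For orientation, the core change WGAN makes to the training loop is the critic objective plus weight clipping as a crude Lipschitz constraint. The sketch below is not taken from the video; it assumes a DCGAN-style `critic` and `gen` (noise shaped N x Z_DIM x 1 x 1) and hypothetical names such as `opt_critic`, `Z_DIM`, and `WEIGHT_CLIP`:

```python
import torch

# Hypothetical hyperparameters; the video's exact values may differ.
CRITIC_ITERATIONS = 5   # critic updates per generator update (used by the outer loop)
WEIGHT_CLIP = 0.01
Z_DIM = 100


def train_critic_step(critic, gen, real, opt_critic, device="cpu"):
    """One WGAN critic update (weight-clipping variant, not WGAN-GP)."""
    noise = torch.randn(real.size(0), Z_DIM, 1, 1, device=device)
    fake = gen(noise)

    critic_real = critic(real).reshape(-1)
    critic_fake = critic(fake.detach()).reshape(-1)

    # WGAN critic objective: maximize E[critic(real)] - E[critic(fake)],
    # implemented here as minimizing the negative.
    loss_critic = -(torch.mean(critic_real) - torch.mean(critic_fake))

    opt_critic.zero_grad()
    loss_critic.backward()
    opt_critic.step()

    # Crude Lipschitz enforcement: clip every critic weight to a small range.
    for p in critic.parameters():
        p.data.clamp_(-WEIGHT_CLIP, WEIGHT_CLIP)

    return loss_critic.item()
```

In the full loop the critic is typically updated CRITIC_ITERATIONS times per generator step, and the generator is then trained to minimize -E[critic(gen(z))].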
Syllabus
- Introduction
- Understanding WGAN
- WGAN Implementation Details
- Coding WGAN
- Understanding WGAN-GP
- Coding WGAN-GP
- Ending
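The WGAN-GP portion of the tutorial replaces weight clipping with a gradient penalty. As a rough sketch (not the video's exact code), the penalty evaluates the critic on random interpolations between real and fake images and pushes the per-sample gradient norm toward 1; the function and variable names below are illustrative assumptions:

```python
import torch


def gradient_penalty(critic, real, fake, device="cpu"):
    """WGAN-GP penalty term on interpolated samples (illustrative sketch)."""
    batch_size = real.size(0)
    # One interpolation coefficient per sample, broadcast over C, H, W.
    epsilon = torch.rand(batch_size, 1, 1, 1, device=device)
    # Detach so the penalty only flows through the critic, then re-enable grad
    # on the interpolated images so we can differentiate with respect to them.
    interpolated = (epsilon * real + (1 - epsilon) * fake).detach().requires_grad_(True)

    mixed_scores = critic(interpolated)

    # Gradient of the critic scores with respect to the interpolated images.
    gradients = torch.autograd.grad(
        outputs=mixed_scores,
        inputs=interpolated,
        grad_outputs=torch.ones_like(mixed_scores),
        create_graph=True,
        retain_graph=True,
    )[0]

    gradients = gradients.reshape(batch_size, -1)
    gradient_norm = gradients.norm(2, dim=1)
    # Penalize deviation of the per-sample gradient norm from 1.
    return torch.mean((gradient_norm - 1) ** 2)
```

Under these assumptions, the critic loss becomes -(E[critic(real)] - E[critic(fake)]) + LAMBDA_GP * gradient_penalty(...), with LAMBDA_GP commonly set to 10, and the weight-clipping step from plain WGAN is dropped.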
Taught by
Aladdin Persson
Related Courses
- Neural Networks for Machine Learning (University of Toronto via Coursera)
- 機器學習技法 (Machine Learning Techniques) (National Taiwan University via Coursera)
- Machine Learning Capstone: An Intelligent Application with Deep Learning (University of Washington via Coursera)
- Прикладные задачи анализа данных (Applied Problems of Data Analysis) (Moscow Institute of Physics and Technology via Coursera)
- Leading Ambitious Teaching and Learning (Microsoft via edX)