Neural Nets - Rotation and Squashing
Offered By: Alfredo Canziani via YouTube
Course Description
Overview
Explore the fundamentals of neural networks in this comprehensive lecture focusing on rotation and squashing operations. Delve into affine transformations and non-linearities, gaining an intuitive understanding through visual explanations. Learn to implement 2×2 linear transformations in a Jupyter notebook with PyTorch, and discover the power of activation functions like the hyperbolic tangent and ReLU. Witness the construction of deep neural networks step by step, from basic components to complex architectures. Gain practical coding experience and theoretical insights, concluding with a thorough summary of key concepts in neural network design and functionality.
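As a taste of the coding portion, the sketch below shows the core "rotation + squashing" step in PyTorch: a 2×2 affine layer followed by the tanh and ReLU non-linearities. This is a minimal illustration under assumed inputs, not the lecture's actual notebook; the variable names and the random input points are placeholders.

import torch
import torch.nn as nn

# A 2x2 affine layer: rotates, scales, and shears points in the plane
# (nn.Linear also adds a bias, making the map affine rather than purely linear).
linear = nn.Linear(2, 2)

# The two squashing non-linearities discussed in the lecture.
tanh = nn.Tanh()
relu = nn.ReLU()

# A placeholder batch of 2-D points standing in for the lecture's input data.
x = torch.randn(100, 2)

# "Rotation" (affine map) followed by "squashing" (non-linearity).
h_tanh = tanh(linear(x))   # coordinates squashed into (-1, 1)
h_relu = relu(linear(x))   # negative coordinates clipped to 0

print(h_tanh.shape, h_relu.shape)  # torch.Size([100, 2]) torch.Size([100, 2])

Because nn.Linear(2, 2) carries a bias term, it implements the affine transformation of the lecture; dropping the bias would give a purely linear map.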
Syllabus
– Welcome!
– Affine transformations and non-linearities
– Affine transformation: intuition
– Summary slide
– Jupyter and PyTorch
– Input data
– Coding a 2×2 linear transformation & Gilbert Strang
– Coding a 2×2 linear transformation w/ PyTorch
– Hyperbolic tangent
– Rotation + squashing + rotation: ooooh, a neural net
– Rectified linear unit (ReLU)
– Shoutout to @vcubingx and his animation
– Spiky transformation: what happens here?
– A *very deep* neural net
– A deep net with tanh (see the code sketch after this syllabus)
– Summary of today's lesson
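As referenced above, here is a minimal sketch of how the "very deep" net and the deep net with tanh can be assembled by repeating the affine-then-squash step; the depth, layer widths, and names below are illustrative assumptions, not the lecture's exact model.

import torch
import torch.nn as nn

# A deep net as repeated "rotation + squashing": several 2x2 affine layers,
# each followed by a tanh squashing, plus a final affine read-out layer.
depth = 8                                 # illustrative choice, not the lecture's value
layers = []
for _ in range(depth):
    layers += [nn.Linear(2, 2), nn.Tanh()]
layers.append(nn.Linear(2, 2))            # final affine map, no squashing
deep_net = nn.Sequential(*layers)

x = torch.randn(100, 2)                   # a placeholder batch of 2-D points
y = deep_net(x)
print(y.shape)                            # torch.Size([100, 2])

Swapping nn.Tanh() for nn.ReLU() gives the ReLU variant of the same construction.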
Taught by
Alfredo Canziani
Related Courses
– Neural Networks for Machine Learning (University of Toronto via Coursera)
– Good Brain, Bad Brain: Basics (University of Birmingham via FutureLearn)
– Statistical Learning with R (Stanford University via edX)
– Machine Learning 1—Supervised Learning (Brown University via Udacity)
– Fundamentals of Neuroscience, Part 2: Neurons and Networks (Harvard University via edX)