Neural Nets - Rotation and Squashing
Offered By: Alfredo Canziani via YouTube
Course Description
Overview
Explore the fundamentals of neural networks in this comprehensive lecture focusing on rotation and squashing operations. Delve into affine transformations and non-linearities, gaining intuitive understanding through visual explanations. Learn to implement a 2×2 linear transformation in a Jupyter notebook with PyTorch, and discover the power of activation functions such as the hyperbolic tangent and ReLU. Watch deep neural networks being constructed step by step, from basic components to complex architectures. Gain practical coding experience and theoretical insights, concluding with a thorough summary of key concepts in neural network design and functionality.
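As a rough illustration of the "rotation then squashing" idea described above, the sketch below applies a 2×2 linear map to a cloud of 2-D points and then squashes the result with tanh. It is a minimal PyTorch example assuming a 45-degree rotation matrix; it is not the lecture's actual notebook code.

    import math
    import torch

    # Minimal sketch (not the lecture's notebook): a 2x2 linear map ("rotation")
    # applied to a batch of 2-D points, then a tanh "squashing" non-linearity.
    torch.manual_seed(0)
    X = torch.randn(1000, 2)                       # a cloud of 2-D input points

    theta = math.pi / 4                            # assumed 45-degree angle
    W = torch.tensor([[math.cos(theta), -math.sin(theta)],
                      [math.sin(theta),  math.cos(theta)]])

    Y = X @ W.T                                    # rotate: the 2x2 linear transformation
    Z = torch.tanh(Y)                              # squash: pointwise map into (-1, 1)
    print(Z.shape)                                 # torch.Size([1000, 2])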
Syllabus
– Welcome!
– Affine transformations and non-linearities
– Affine transformation: intuition
– Summary slide
– Jupyter and PyTorch
– Input data
– Coding a 2×2 linear transformation & Gilbert Strang
– Coding a 2×2 linear transformation w/ PyTorch
– Hyperbolic tangent
– Rotation + squashing + rotation: ooooh, a neural net
– Rectified linear unit (ReLU)
– Shoutout to @vcubingx and his animation
– Spiky transformation: what happens here?
– A *very deep* neural net
– A deep net with tanh (a related sketch follows this syllabus)
– Summary of today's lesson
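The deep-net portion of the syllabus stacks the same two ingredients, an affine transformation followed by a pointwise non-linearity, several times. The sketch below builds such a stack with nn.Sequential; the 2 → 100 → 100 → 2 layer sizes and the depth are illustrative assumptions, not dimensions taken from the lecture.

    import torch
    from torch import nn

    # Sketch of stacking "rotation + squashing" into a deep network.
    # Layer sizes are illustrative assumptions, not from the lecture.
    model = nn.Sequential(
        nn.Linear(2, 100), nn.Tanh(),    # affine map ("rotation") then squashing
        nn.Linear(100, 100), nn.Tanh(),  # repeat: each block warps the plane further
        nn.Linear(100, 2),               # final affine map back to 2-D
    )

    X = torch.randn(1000, 2)             # random 2-D points
    with torch.no_grad():
        Z = model(X)                     # forward pass through the stacked layers
    print(Z.shape)                       # torch.Size([1000, 2])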
Taught by
Alfredo Canziani
Related Courses
Introduction to Data Science in Python – University of Michigan via Coursera
Julia Scientific Programming – University of Cape Town via Coursera
Python for Data Science – University of California, San Diego via edX
Probability and Statistics in Data Science using Python – University of California, San Diego via edX
Introduction to Python: Fundamentals – Microsoft via edX