Parameter Sharing - Recurrent and Convolutional Nets
Offered By: Alfredo Canziani via YouTube
Syllabus
– Welcome to class
– Hypernetworks
– Shared weights
– Parameter sharing ⇒ adding the gradients
– Max and sum reductions
– Recurrent nets
– Unrolling in time
– Vanishing and exploding gradients
– Math on the whiteboard
– RNN tricks
– RNN for differential equations
– GRU
– What is a memory
– LSTM – Long Short-Term Memory net
– Multilayer LSTM
– Attention for sequence to sequence mapping
– Convolutional nets
– Detecting motifs in images
– Convolution definitions
– Backprop through convolutions
– Stride and skip: subsampling and convolution “à trous”
– Convolutional net architecture
– Multiple convolutions
– Vintage ConvNets
– How does the brain interpret images?
– Hubel & Wiesel's model of the visual cortex
– Invariance and equivariance of ConvNets
– In the next episode…
– Training time, iteration cycle, and historical remarks
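The segment "Parameter sharing ⇒ adding the gradients" refers to a fact that underlies both recurrent and convolutional nets: when one parameter is used at several sites in the computation graph, its total gradient is the sum of the per-site gradients. A minimal numerical sketch of this (plain Python, not code from the lecture; the variable names are illustrative only):

```python
# A shared weight w is used at two sites: y = w*x1 + w*x2.
# By the chain rule, dy/dw = x1 + x2: the per-site gradients add.
w, x1, x2 = 3.0, 2.0, 5.0

grad_site1 = x1                       # d(w*x1)/dw
grad_site2 = x2                       # d(w*x2)/dw
total_grad = grad_site1 + grad_site2  # gradients add across shared uses

# Finite-difference check of the same derivative
eps = 1e-6
y = lambda w: w * x1 + w * x2
numeric = (y(w + eps) - y(w - eps)) / (2 * eps)
print(total_grad, round(numeric, 4))
```

The same accumulation happens automatically in autodiff frameworks: an RNN reusing its weight matrix at every time step, or a convolution reusing its kernel at every spatial position, receives the sum of the gradients from all uses.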
Taught by
Alfredo Canziani
Related Courses
– Reinforcement Learning for Trading Strategies (New York Institute of Finance via Coursera)
– Natural Language Processing with Sequence Models (DeepLearning.AI via Coursera)
– Fake News Detection with Machine Learning (Coursera Project Network via Coursera)
– English/French Translator: Long Short Term Memory Networks (Coursera Project Network via Coursera)
– Text Classification Using Word2Vec and LSTM on Keras (Coursera Project Network via Coursera)